


The Alpha Software Forum Participation Guidelines

The Alpha Software Forum is a free forum created for the Alpha Software Developer Community to ask for help, exchange ideas, and share solutions. Alpha Software strives to create an environment where all members of the community can feel safe to participate. In order to ensure the Alpha Software Forum is a place where all feel welcome, forum participants are expected to behave as follows:
  • Be professional in your conduct
  • Be kind to others
  • Be constructive when giving feedback
  • Be open to new ideas and suggestions
  • Stay on topic

Be sure all comments and threads you post are respectful. Posts that contain any of the following content will be considered a violation of your agreement as a member of the Alpha Software Forum Community and will be moderated:
  • Spam.
  • Vulgar language.
  • Quotes from private conversations without permission, including pricing and other sales related discussions.
  • Personal attacks, insults, or subtle put-downs.
  • Harassment, bullying, threatening, mocking, shaming, or deriding anyone.
  • Sexist, racist, homophobic, transphobic, ableist, or otherwise discriminatory jokes and language.
  • Sexually explicit or violent material, links, or language.
  • Pirated, hacked, or copyright-infringing material.
  • Encouraging of others to engage in the above behaviors.

If a thread or post is found to contain any of the content outlined above, a moderator may choose to take one of the following actions:
  • Remove the Post or Thread - the content is removed from the forum.
  • Place the User in Moderation - all posts and new threads must be approved by a moderator before they are posted.
  • Temporarily Ban the User - user is banned from forum for a period of time.
  • Permanently Ban the User - user is permanently banned from the forum.

Moderators may also rename posts and threads if they are too generic or do not properly reflect the content.

Moderators may move threads if they have been posted in the incorrect forum.

Threads/Posts questioning specific moderator decisions or actions (such as "why was a user banned?") are not allowed and will be removed.

The owners of Alpha Software Corporation (Forum Owner) reserve the right to remove, edit, move, or close any thread for any reason; or ban any forum member without notice, reason, or explanation.

Community members are encouraged to click the "Report Post" icon in the lower left of a given post if they feel the post is in violation of the rules. This will alert the Moderators to take a look.

Alpha Software Corporation may amend the guidelines from time to time and may also vary the procedures it sets out where appropriate in a particular case. Your agreement to comply with the guidelines will be deemed agreement to any changes to it.

Bonus TIPS for Successful Posting

Try a Search First
It is highly recommended that a Search be done on your topic before posting, as many questions have been answered in prior posts. As with any search engine, the shorter the search term, the more "hits" will be returned, but the more specific the search term is, the greater the relevance of those "hits". Searching for "table" might well return every message on the board while "tablesum" would greatly restrict the number of messages returned.

When you do post
First, make sure you are posting your question in the correct forum. For example, if you post an issue regarding Desktop applications on the Mobile & Browser Applications board, not only will your question not be seen by the appropriate audience, it may also be removed or relocated.

The more detail you provide about your problem or question, the more likely someone is to understand your request and be able to help. A sample database with a minimum of records (and its support files, zipped together) will make it much easier to diagnose issues with your application. Screen shots of error messages are especially helpful.

When explaining how to reproduce your problem, please be as detailed as possible. Describe every step, click-by-click and keypress-by-keypress. Otherwise when others try to duplicate your problem, they may do something slightly different and end up with different results.

A note about attachments
You may only attach one file to each message. Attachment file size is limited to 2MB. If you need to include several files, you may do so by zipping them into a single archive.

If you forgot to attach your files to your post, please do NOT create a new thread. Instead, reply to your original message and attach the file there.

When attaching screen shots, it is best to attach an image file (.BMP, .JPG, .GIF, .PNG, etc.) or a zip file of several images, as opposed to a Word document containing the screen shots. Because Word documents are prone to viruses, many message board users will not open your Word file, therefore limiting their ability to help you.

Similarly, if you are uploading a zipped archive, you should simply create a .ZIP file and not a self-extracting .EXE as many users will not run your EXE file.

Multiple Users Running Same Post/Append Operations Between Same Tables at Same Exact Time


  • Multiple Users Running Same Post/Append Operations Between Same Tables at Same Exact Time

    I'm using DBF tables. Is there any risk of data loss, corruption, or anything else worth considering if I have multiple users running the very same Post or Append Operation between the very same tables at the very same time?
    Each user would be transferring their own unique set of data, but the Master Table, Transaction Table, and Operations would be the very same ones. If I lined up multiple users and said "ready, set, go!" and they all ran the same Append Operation at precisely the same moment, is there a risk that some users would succeed while the rest would fail? Would some data go through while some gets lost or skipped? Could the tables get corrupted? Do I need some code to ensure the tables are not in use before executing the operation, so that only one user's command gets executed at any particular moment?

  • #2
    I personally have a system with a governor table that holds user logins and script sequence names for script sequences like the one you describe. When a user initiates the sequence, a check against the governor table detects whether the sequence is in use by another user. If it is, the user is alerted that the sequence is busy. Otherwise, the user's login and the sequence being executed are written to the governor table to stop another user from "colliding", and the sequence is executed. At the completion of the sequence, the user's name and the sequence are deleted from the governor table, making it open for use by another user. This works very well.
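
    [Editor's note] A minimal Xbasic sketch of this governor-table idea follows. The table name "governor", its LOGIN and SEQUENCE fields, the index name, and the way the current login is obtained are all assumptions; adjust them to your own schema.

```xbasic
' Sketch only: "governor", LOGIN, SEQUENCE, and the login value are
' illustrative - adapt to your own schema.
dim tg as P
dim vcUser as C
dim vcSeq as C
vcUser = "CASHIER1"          ' however you identify the logged-in user
vcSeq = "APPEND_INVOICES"    ' name of the script sequence
' Is the sequence already claimed by someone?
if lookup("governor", "SEQUENCE = " + s_quote(vcSeq), "LOGIN") <> "" then
    ui_msg_box("Busy", "That sequence is in use by another user. Please try again.")
    end
end if
' Claim it.
tg = table.open("governor")
tg.enter_begin()
tg.LOGIN = vcUser
tg.SEQUENCE = vcSeq
tg.enter_end(.t.)
tg.close()
' ... run the append/post sequence here ...
' Release the claim.
tg = table.open("governor")
tg.index_primary_put("SEQUENCE")   ' assumes an index named SEQUENCE exists
if tg.fetch_find(vcSeq) > 0 then
    tg.delete()
end if
tg.close()
```

    One caveat: the check and the claim are two separate steps, so two users starting at exactly the same instant could both pass the check. This approach narrows the window dramatically but does not fully eliminate it.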
    Mike W
    "I rebel in at least small things to express to the world that I have not completely surrendered"


    • #3
      Thanks for the response, Mike.

      While I understand the concept of what you said, I have no idea how to pull it off.
      Based on your response, should I assume you are confirming that it would indeed end badly if I had multiple users going at it all at once?


      • #4
        You can always add a check to your script to see whether the post or append was done.
        You could also check for exclusive access and, if it is not available, set up a retry with a message - "Awaiting exclusive access" - with a countdown / try-again button.
        Generally I find posts and appends do not require exclusive access and work very well with multiple users, but that always depends on what you are posting / appending.
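
        [Editor's note] The exclusive-access-with-retry idea could be sketched roughly as below. The table name is illustrative, and the use of sleep() for the delay is an assumption; FILE_RW_EXCLUSIVE is the standard Xbasic exclusive open mode.

```xbasic
' Sketch only: try up to 10 times to open the table exclusively,
' waiting a second between attempts.
dim t as P
dim nTry as N
dim vlOpened as L
vlOpened = .f.
for nTry = 1 to 10
    on error goto try_failed
    t = table.open("invoice_items", FILE_RW_EXCLUSIVE)
    on error goto 0
    vlOpened = .t.
    exit for
    try_failed:
    on error goto 0
    sleep(1)   ' assumed delay; wait before retrying
next nTry
if .not. vlOpened then
    ui_msg_box("Busy", "Could not get exclusive access - please try again shortly.")
    end
end if
' ... run the operation here ...
t.close()
```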


        • #5
          Nigel, Alpha Five operations are designed (and should be used) for batch processing of multiple records in circumstances where no one else is using the master table. These "operations" are saved to the operations tab of the Alpha control panel. Is batch processing of this sort what you have in mind?

          It's possible you're using the term "operation" loosely to refer to single-record transactions that change values in a target table, or enter a new record in a target table. This is not what Alpha means by "operation". If this is what you have in mind, consider not using the term "operation" in your posts; it's confusing. Readers will assume you're talking about batch processing of multiple records from a transaction table to a master table, rather than individual record inserts or changes. A very different thing.


          • #6
            Hey Tom, unless I am mistaken due to limited knowledge, I meant exactly what I said, and I explained it repeatedly in the original post.

            I am talking about moving multiple records from one table to another in a batch. This would be done with an actual "Append Operation" from the Operations tab in the control panel.

            I have been developing POS software for the past 7-8 years, and I was NEVER pleased with its speed. I have posted many topics about different parts of it over the years. Just lately, thanks to this forum and you guys, I learned how to use the profiler to see the execution time of each line of code. I now realize that, for example, when scanning items at the register, each item scanned takes about a second (sometimes longer) to be located in the "Products" table and added to the "Invoice_Items" table. While a "second" may sound very fast, for POS software it is actually very slow. With "Quickbooks POS" and other professional packages, items are added as fast as the cashiers can move, but with my software I often watch a cashier scan an item and then have to wait a moment before scanning the next one. This is the case with the single cashier that uses this software, and the cashier's PC is the server. I cannot allow additional cashiers because the time over the network would just make you suicidal.
            The profiler made me realize that the bulk of the time is spent not on querying the products, as I previously thought, but rather on adding them to the "Invoice_Items" table. Out of about 100 lines of code, the following two lines alone take over half a second to execute.

            The "Invoice_Items" table is over 800,000 records and counting. I'm now thinking of using 2 temp tables, a temporary "Invoice_Header" and a temporary "Invoice_Items". The only records in these 2 tables at any time would be the currently open invoice.
            I assume this would make all transactions satisfactorily faster. Upon ending the transaction, the records would get transferred from the 2 temp tables to the permanent tables via an Append Operation. If this method can and does work, then scanning products would become faster, and it might even allow additional cashiers over the network.

            This is why I asked the original question. I'd like to know how practical it would be to have multiple cashiers constantly appending their invoices from the 2 temp tables to the 2 permanent tables; there is no doubt that 2 or more cashiers would be doing this at precisely the same moment several times each day.
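
            [Editor's note] If a saved Append Operation proves risky with simultaneous users, the transfer step could also be written as a plain Xbasic fetch loop that only ever adds records to the permanent table. A minimal sketch, with illustrative table and field names:

```xbasic
' Sketch only: copy the current invoice's rows from a temp items
' table into the permanent one. Table and field names are assumed.
dim tSrc as P
dim tDst as P
tSrc = table.open("invoice_items_temp")
tDst = table.open("invoice_items")
tSrc.fetch_first()
while .not. tSrc.fetch_eof()
    tDst.enter_begin()
    tDst.INVOICE_NO = tSrc.INVOICE_NO
    tDst.ITEM = tSrc.ITEM
    tDst.QUANTITY = tSrc.QUANTITY
    tDst.PRICE = tSrc.PRICE
    tDst.enter_end(.t.)
    tSrc.fetch_next()
end while
tDst.close()
tSrc.close()
' The temp table would then be emptied, e.g. by deleting its records,
' before the next invoice is started.
```

            Because each workstation only appends new records rather than rewriting shared ones, simultaneous runs mostly contend on record locks during enter_end(); whether that is safe enough for DBF tables over a network is exactly the question this thread raises.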


            • #7
              Thanks, Nigel. I apologize for missing your previous explanations. I did not mean to offend.

              Your post yesterday provided many useful details.

              ' Brainstorming turned on ...

              1) Might it be useful to substantially shrink the invoice_items table by archiving older transactions?

              2) Might it be useful to reduce the number of indextags defined for the invoice hdr and items tables?

              3) Are inventory records being "updated" when items are sold? i.e. qty on hand being reduced?

              4) Are other tables being "updated" when items are committed? i.e. running totals for repeat customers?

              5) Opening and closing tables over the network is always going to be among the slowest things Alpha does. Might it be possible to shift the load to the local workstations, and then update the server overnight? I'm imagining a situation where your price list / inventory data is loaded locally to each workstation each night. The workstations enter transactions and print receipts using local tables. No interaction with the server occurs during the workday. Nightly, the workstation data is uploaded to the server. Inventory and other adjustments are made overnight. And new price list / inventory data is sent back down to each workstation to get ready for the next day. Is this crazy?

              * Brainstorming off ...


              • #8
                Nigel, building on a couple of points you and Tom have made.
                Expensive software like QB probably uses an IMDB (In-Memory Database) and C# or similar.
                If you can handle the transactions for a customer locally, that will be better than over a network.
                The stock check issue won't matter, as the item is in the cashier's hand.
                You could collect the data to append and either send it between customers or do it at close of business.
                The problem of price lists may need looking into as well.

                I'd probably try to break the process up a bit so you have a Front Office and a Back Office.
                Having done this for the jewellery trade, local at machine level worked best.
                Ted Giles
                Example Consulting - UK

                See our site for Alpha Support, Conversion and Upgrade.


                • #9
                  Hello Nigel,

                  A number of years ago I wrote a POS app using a bar code scanner. We had 2 work stations running at checkout. Admittedly, the lookup table may have contained only 1000 records.

                  That said, I believe I’m using a different approach than you.
                  1. When it comes time to start the checkout, the form is placed into enter mode.
                  2. In the field rules for the field where the bar code is entered, I checked the box "Auto advance after field filled". In my case, all the bar code values are the same length.
                  3. I then created a hidden field set as the next field in the tab order. Its OnArrive event then handles the entire process of adding the record to the items table.
                  4. The script does more than just look up the product. When the script finishes, the form is again in enter mode with the cursor in the barcode field, awaiting the next scan.

                  It's lightning fast with no lag at all. In fact, it ran so fast that I had to add xbasic_wait_for_idle(.1) in 2 places to slow the code down.

                  Here is the OnArrive event code:

                  'Date Created: 08-Aug-2012 10:09:47 PM
                  'Last Updated: 04-Nov-2014 09:34:56 PM
                  'Created By : Ron
                  'Updated By : Ron
                  'Get 'Value' property of 'Id' in Form 'Items_Sold'.
                  DIM SHARED vcTag_No AS c
                  vcTag_No = parentform:Tag_No.value
                  dim vcItem as C
                  dim vcDesc as C
                  'dim vcSize as C
                  dim vnPrice as N
                  vcItem = lookup("item","Tag_no="+s_quote(vcTag_No),"item")
                  vcDesc = lookup("item","Tag_no="+s_quote(vcTag_No),"Description")
                  'vcSize = lookup("item","Tag_no="+s_quote(vcTag_No),"Size")
                  vnprice = lookup("item","Tag_No="+s_quote(vcTag_No),"price")
                  'Set 'Value' property of multiple objects in Form.
                  parentform:Control_browse1:Sold.value = .T.
                  parentform:Control_browse1:Price.value = vnPrice
                  parentform:Control_browse1:Item.value = vcItem
                  parentform:Control_browse1:Description.value = vcDesc
                  'parentform:Control_browse1:Size.value = vcSize
                  parentform:Time_Stamp.value = now()
                  parentform:Invoiceno0.value = topparent:invoiceno.value
                  dim t as P
                  dim indx as P
                  dim rec as N
                  t = table.open("item")
                  indx = t.index_primary_put("Tag_No")
                  rec = t.fetch_find(vcTag_No)
                  t.change_begin()
                  t.sold = .T.
                  t.change_end(.t.)
                  t.close()
                  'Save record in current form at parent level.
                  dim vcSound as C
                  vcSound = a5.Get_Path()+"\\"+"Record"
                  play_sound(vcSound, UI_SYSTEM_SOUND) ' + UI_WAIT_SOUND)
                  'Execute inline Xbasic code.
                  'Enter new record in current form at browse level.
                  'Execute inline Xbasic code.
                  'Activate object 'Id' in current form.
                  To the right of the TAG_NO field is the hidden field whose OnArrive event runs the above code.


                  Hope this helps.
                  Attached Files
                  Alpha Anywhere v3.5
                  Version 12.3, Build 2999
                  MySql56 & DBF's

                  Alpha 4, 5, Anywhere

                  Ron Anusiewicz


                  • #10
                    Thanks Tom, we're good.
                    1) Thinking of the hassle of preparing forms and reports to work with archives, I wouldn't jump at that as a solution, but it is possible.
                    2) I don't have that many indexes, and the ones I do have are very important for both the cashier and back-office needs.
                    3) Yes, the inventory is being updated, not via field rules but via Operations at the closing of each transaction.
                    4) Conditionally, "yes", other tables may get updated. For example, Customers.dbf gets updated upon closing each credit transaction.
                    5) The opening and closing of tables are not currently being done over the network, since there is always only 1 cashier and the cashier's PC is the server.
                    I have asked for assistance before about processing data on each local PC and then transferring it to the server later, but no one gave me a reason to believe this was possible, so I ended up assuming it was not. One issue I would run into if I did make it work is that I use auto-incremented fields for my "Invoice #", so I would probably open myself up to conflicting Invoice #s if I started processing data locally and transferring it afterward.
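
                    [Editor's note] The Invoice # conflict could be avoided by handing out numbers from a central counter table on the server rather than an auto-increment field rule. A minimal sketch; the table "counters" and its NEXT_INV field are assumptions:

```xbasic
' Sketch only: take the next invoice number from a one-record
' counter table; change_begin() locks the record so two stations
' cannot take the same number.
dim tc as P
dim vnInv as N
tc = table.open("counters")
tc.fetch_first()
tc.change_begin()
vnInv = tc.NEXT_INV
tc.NEXT_INV = vnInv + 1
tc.change_end(.t.)
tc.close()
' vnInv is now unique across workstations and can be stamped on
' locally entered invoice records before they are uploaded.
```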

                    Since my single cashier is using the server, my issue goes beyond your suggestion, because, as it stands right now, the cashier is on the server and it is still too slow.

                    From what I understand of your code, that would still leave me with the slow process of entering the new record and saving it; that is where my code spends the most time.

                    Was it a bad idea to use a few calculated fields within the table's field rules?
                    The following are calculated fields that are defined in the field rules for the "Invoice_Items" table.
                    Price = If(tax=.t.,Unit_Price + Var->Tax_Amount,Unit_Price)
                    Extended Price = Price x Quantity


                    • #11

                      I’m afraid you didn’t look at my response carefully.

                      Right or wrong, my solution is done at the form level. What you do not see in the form attachment is a button called "Start Scanning". That button puts the form into enter mode and moves the cursor to the TAG_NO field where the barcode is entered. That means the table is already open. As soon as the barcode field is filled, the cursor automatically moves to the hidden field, where the OnArrive event code is executed. The hidden field is the small square to the right of the TAG_NO field. Before the code finishes, the record is committed and the form is put back into enter mode, waiting for the next barcode to be scanned. When the user is done scanning, they push the "Finished Scanning" button, which puts the form into view mode.
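
                      [Editor's note] The "Start Scanning" button described above could be sketched roughly like this; the exact button code is not shown in the thread, so this is a guess using the object names from the post:

```xbasic
' Sketch only: put the form into enter mode and park the cursor in
' the barcode field. Object names (Tag_No) come from Ron's post.
parentform.new_record()
parentform:Tag_No.activate()
```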

                      There may be a better methodology for doing this checkout process. With my small lookup table, I can enter a barcode as fast as I can scan. I just do not know how it will work with a large lookup table.

                      I suggest you build a test form utilizing my example for guidance and test it. It’s a lot easier to build a test form to experiment with, perfect, and then apply the necessary changes to the real form if the results are what you want.

                      Not sure how well it will work in a multi-user environment. However, this app has been run for a weekend every year since 2012, using two computers, scanning a low volume of items, to do checkouts with no problems.

                      From your previous post a month ago or more, your script appears to be run from a button on a form. In fact, I’m not sure how you are triggering the script. That script is doing all the work behind the scenes.

                      Alpha Anywhere v3.5
                      Version 12.3, Build 2999
                      MySql56 & DBF's

                      Alpha 4, 5, Anywhere

                      Ron Anusiewicz


                      • #12
                        The calculated fields are fine but add to the process.
                        Try putting the calc on the form and see.
                        Ted Giles
                        Example Consulting - UK

                        See our site for Alpha Support, Conversion and Upgrade.


                        • #13
                          I have looked over your code again, and read your explanation a few times now.
                          Would you say the secret to your speed is keeping the table in "enter mode", as opposed to entering the new record after running a scan and locating the desired product?

                          Please forgive me if I ask some silly questions here, but sometimes I'm just not sure of certain things, and the help file is so tremendously large that I usually don't find what I seek when I seek it. I tend to accidentally stumble upon topics while looking for other things.
                          • Can you say definitively that calculated fields on a form will result in better speed?
                          • I believe I once read that calculated fields inside a table are not constantly recalculated, but only recalculated when their data changes; is this correct?
                          • Is there any need to be concerned about the table recalculating data for past records, or does it only calculate the current record and leave past records alone?


                          • #14
                            Nigel, from the Help Text, this is what causes a Calculated Field to be updated:

                              • A record is added
                              • One of the calculated field's components is changed
                              • Re-Evaluate Rules is triggered by the user

                            In V10.5 there were speed issues when a number of Calc Fields were used on a Form, and there is a workaround, so unless there are a lot of them, it probably won't affect things much.

                            "each item scanned takes about a second (sometimes longer) to be located in the "Products" table and added to the "Invoice_Items" table."

                            Can you share the construction of the process?
                            Is this done in a Set or through Variables?
                            How long are the field names in the tables? I ask this as I have seen issues where the first few characters of field names are similar.

                            Over 8 char in the name and it all slows down.

                            Ted Giles
                            Example Consulting - UK

                            See our site for Alpha Support, Conversion and Upgrade.


                            • #15
                              Over 8 char in the name and it all slows down
                              Oops, I thought 10 characters was the limit.