
Announcement


The Alpha Software Forum Participation Guidelines

The Alpha Software Forum is a free forum created for Alpha Software Developer Community to ask for help, exchange ideas, and share solutions. Alpha Software strives to create an environment where all members of the community can feel safe to participate. In order to ensure the Alpha Software Forum is a place where all feel welcome, forum participants are expected to behave as follows:
  • Be professional in your conduct
  • Be kind to others
  • Be constructive when giving feedback
  • Be open to new ideas and suggestions
  • Stay on topic


Be sure all comments and threads you post are respectful. Posts that contain any of the following content will be considered a violation of your agreement as a member of the Alpha Software Forum Community and will be moderated:
  • Spam.
  • Vulgar language.
  • Quotes from private conversations without permission, including pricing and other sales related discussions.
  • Personal attacks, insults, or subtle put-downs.
  • Harassment, bullying, threatening, mocking, shaming, or deriding anyone.
  • Sexist, racist, homophobic, transphobic, ableist, or otherwise discriminatory jokes and language.
  • Sexually explicit or violent material, links, or language.
  • Pirated, hacked, or copyright-infringing material.
  • Encouraging of others to engage in the above behaviors.


If a thread or post is found to contain any of the content outlined above, a moderator may choose to take one of the following actions:
  • Remove the Post or Thread - the content is removed from the forum.
  • Place the User in Moderation - all posts and new threads must be approved by a moderator before they are posted.
  • Temporarily Ban the User - user is banned from forum for a period of time.
  • Permanently Ban the User - user is permanently banned from the forum.


Moderators may also rename posts and threads if they are too generic or do not properly reflect the content.

Moderators may move threads if they have been posted in the incorrect forum.

Threads/Posts questioning specific moderator decisions or actions (such as "why was a user banned?") are not allowed and will be removed.

The owners of Alpha Software Corporation (Forum Owner) reserve the right to remove, edit, move, or close any thread for any reason; or ban any forum member without notice, reason, or explanation.

Community members are encouraged to click the "Report Post" icon in the lower left of a given post if they feel the post is in violation of the rules. This will alert the Moderators to take a look.

Alpha Software Corporation may amend the guidelines from time to time and may also vary the procedures it sets out where appropriate in a particular case. Your agreement to comply with the guidelines will be deemed agreement to any changes to them.



Bonus TIPS for Successful Posting

Try a Search First
It is highly recommended that a Search be done on your topic before posting, as many questions have been answered in prior posts. As with any search engine, the shorter the search term, the more "hits" will be returned, but the more specific the search term is, the greater the relevance of those "hits". Searching for "table" might well return every message on the board while "tablesum" would greatly restrict the number of messages returned.

When you do post
First, make sure you are posting your question in the correct forum. For example, if you post an issue regarding Desktop applications on the Mobile & Browser Applications board, not only will your question not be seen by the appropriate audience, it may also be removed or relocated.

The more detail you provide about your problem or question, the more likely someone is to understand your request and be able to help. A sample database with a minimum of records (and its support files, zipped together) will make it much easier to diagnose issues with your application. Screen shots of error messages are especially helpful.

When explaining how to reproduce your problem, please be as detailed as possible. Describe every step, click-by-click and keypress-by-keypress. Otherwise when others try to duplicate your problem, they may do something slightly different and end up with different results.

A note about attachments
You may only attach one file to each message. Attachment file size is limited to 2MB. If you need to include several files, you may do so by zipping them into a single archive.

If you forgot to attach your files to your post, please do NOT create a new thread. Instead, reply to your original message and attach the file there.

When attaching screen shots, it is best to attach an image file (.BMP, .JPG, .GIF, .PNG, etc.) or a zip file of several images, as opposed to a Word document containing the screen shots. Because Word documents can carry macro viruses, many message board users will not open your Word file, thereby limiting their ability to help you.

Similarly, if you are uploading a zipped archive, you should simply create a .ZIP file and not a self-extracting .EXE as many users will not run your EXE file.

How can I end up with fewer records with a delete duplicates operation?


    How can I end up with fewer records with a delete duplicates operation?

    I receive daily downloads from a vendor that does installation services for my company. The installation events accumulate in their system as they are reported, and are sent to me in an End of Day (EOD) spreadsheet that contains a rolling 5-day window of data. I collect the spreadsheet reports and process the collected group every 3-4 days. Beginning with a fresh (emptied) temp table, I have a script that imports the spreadsheet data one spreadsheet at a time into the temp table and then runs an operation to delete duplicates. Somehow, after executing the script on the second spreadsheet, I have fewer records. At the conclusion of running the script and importing the first spreadsheet into the empty and packed table, there are 4572 records. After running the import script on the second spreadsheet I have 4518 records. What could have possibly caused the deletion of records if I am just deleting duplicates?
    Mike W
    __________________________
    "I rebel in at least small things to express to the world that I have not completely surrendered"
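Because each EOD file carries a rolling 5-day window, consecutive files share most of their rows, so a delete-duplicates pass after the second import is expected to collapse the overlap. A toy Python model of that overlap (hypothetical event IDs; Alpha's delete-duplicates operation works on table fields, not Python lists):

```python
# Simulate two EOD files whose 5-day windows overlap.
# Each record is reduced to a hypothetical event ID for illustration.
file1 = [f"evt{i}" for i in range(0, 100)]    # days 1-5
file2 = [f"evt{i}" for i in range(60, 160)]   # days 4-8: overlaps file1

temp_table = []
temp_table.extend(file1)                 # first import
count_after_first = len(temp_table)      # 100

temp_table.extend(file2)                 # second import appends 100 more rows
# Delete duplicates: keep the first copy of each event ID, preserving order.
temp_table = list(dict.fromkeys(temp_table))

# The overlap (evt60..evt99) collapses, so the table grows by only
# the 60 genuinely new events.
print(count_after_first, len(temp_table))  # 100 160
```

Note that overlap alone cannot shrink the table below the first import's deduplicated count; a drop like 4572 to 4518 suggests the duplicate key is matching rows that are not truly identical, or that the first count was taken before deduplication.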

    #2
    Re: How can I end up with fewer records with a delete duplicates operation?

    Hi Mike,
    Have you tried testing the spreadsheet saved as a dbf and running AA's duplicate genie on it (without actually deleting anything yet) before you import it?
    Robin

    Discernment is not needed in things that differ, but in those things that appear to be the same. - Miles Sanford



      #3
      Re: How can I end up with fewer records with a delete duplicates operation?

      Hi Robin,
      I can't figure it out. I'm trying all kinds of things to sort through this. I'm changing approaches to see if I can achieve what I need with an alternate strategy, because this one is not working.
      Mike W
      __________________________
      "I rebel in at least small things to express to the world that I have not completely surrendered"



        #4
        Re: How can I end up with fewer records with a delete duplicates operation?

        My thought was that your routine may be finding duplicates in the spreadsheet itself; if you test the spreadsheet with the duplicate genie, it will show you how many it finds.
        Robin

        Discernment is not needed in things that differ, but in those things that appear to be the same. - Miles Sanford



          #5
          Re: How can I end up with fewer records with a delete duplicates operation?

          What are you matching to identify a duplicate?
          In one instance I had to create a calculated field comprising 8 other fields to act as the test element.
          I think in the standard Alpha you are limited to 3 fields?
          See our Hybrid Option here;
          https://hybridapps.example-software.com/


          Apologies to anyone I haven't managed to upset yet.
          You are held in a queue and I will get to you soon.
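The calculated-field approach described above, concatenating several normalized fields into a single dedup key, can be sketched like this (Python for illustration; the field names are invented, not from the original tables):

```python
def composite_key(rec):
    """Build one normalized dedup key from several fields
    (hypothetical field names; trim whitespace, ignore case)."""
    fields = ("first", "last", "dob", "postcode")
    return "|".join(str(rec.get(f, "")).strip().lower() for f in fields)

records = [
    {"first": "Adam",  "last": "Smith", "dob": "1970-01-01", "postcode": "AB1 2CD"},
    {"first": "adam ", "last": "SMITH", "dob": "1970-01-01", "postcode": "ab1 2cd"},
    {"first": "Anna",  "last": "Smith", "dob": "1980-05-05", "postcode": "AB1 2CD"},
]

# Keep the first record seen for each composite key.
seen, unique = set(), []
for rec in records:
    key = composite_key(rec)
    if key not in seen:
        seen.add(key)
        unique.append(rec)

print(len(unique))  # 2: the first two rows collapse to one
```

Collapsing everything into a single derived key also sidesteps any limit on how many fields the dedup operation itself can compare.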



            #6
            Re: How can I end up with fewer records with a delete duplicates operation?

            I would create a test spreadsheet with no more than 10 rows of data.
            Include some duplicates to test your routine.
            Once you have it working as expected on the small sample, you should
            be able to apply it to the full table.
            Gregg
            https://paiza.io is a great site to test and share sql code



              #7
              Re: How can I end up with fewer records with a delete duplicates operation?

              Gregg, checking for duplicates isn't an easy matter, as I found out when checking social care records.
              I understand what you are saying, but that is why I asked what fields were being tested.

              Example;

              Adam Smith and Smith Adam
              This was a duplicate, as the user was putting the data in back to front. It is not a case of comparing just two items of data, as you are probably aware; but until I got deeply into the stupidity of data entry and badly designed systems, I wasn't aware just how complex it can be.

              So I had to check;
              First Name
              Last Name
              DoB
              Nat Ins Number
              Post Code
              NHS Number
              First line of address

              I once had a battle which I lost, where there were twins with very long first names with the only difference being the last letter.
              Everything pointed to a dupe, but it wasn't.
              See our Hybrid Option here;
              https://hybridapps.example-software.com/


              Apologies to anyone I haven't managed to upset yet.
              You are held in a queue and I will get to you soon.
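The "Adam Smith / Smith Adam" case can be caught by making the key order-insensitive across the name fields. A small sketch (invented field names; Python rather than Xbasic):

```python
def fuzzy_key(rec):
    """Dedup key that is order-insensitive over first/last name,
    so 'Adam Smith' and 'Smith Adam' collide; DoB stays as-is."""
    names = sorted(n.strip().lower() for n in (rec["first"], rec["last"]))
    return (*names, rec["dob"])

a = {"first": "Adam",  "last": "Smith", "dob": "1970-01-01"}
b = {"first": "Smith", "last": "Adam",  "dob": "1970-01-01"}  # entered back to front

print(fuzzy_key(a) == fuzzy_key(b))  # True
```

As the twins example shows, a key like this only flags *candidate* duplicates for human review; matching keys cannot prove two records are the same person.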



                #8
                Re: How can I end up with fewer records with a delete duplicates operation?

                Based on what you just explained, you're not only checking for duplicates, you're checking for user error, which can really get tricky, as we know there are people with "2 first names".
                However you do it, I would most likely dump the data into a spreadsheet, then write a simple formula to show me possible duplicates.
                I would start with a small number of records just because it's easier to see the dups right away as you test the data.
                Gregg
                https://paiza.io is a great site to test and share sql code



                  #9
                  Re: How can I end up with fewer records with a delete duplicates operation?

                  Did you fix this Mike?
                  See our Hybrid Option here;
                  https://hybridapps.example-software.com/


                  Apologies to anyone I haven't managed to upset yet.
                  You are held in a queue and I will get to you soon.



                    #10
                    Re: How can I end up with fewer records with a delete duplicates operation?

                    Hi Guys,
                    Sorry, I was away for the first time in over 5 years on an actual vacation/holiday! I did not fix or figure out the original issue; I just took a different approach. I was originally checking for and deleting duplicates with each iteration of a spreadsheet import, trying to get the table unique before the next spreadsheet import. This was behaving badly as I described, so I decided to import all the data completely into a temp table and do a single delete duplicates at the end. I had to write a script to examine the full temp table to see how many duplicates existed, to QC that the result was correct. That approach turned out to be successful, reducing the 109,873 imported records to 9877 unique records without data loss or residual duplicates.
                    Mike W
                    __________________________
                    "I rebel in at least small things to express to the world that I have not completely surrendered"
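The final approach, append every file first, dedupe once at the end, and QC the duplicate count, can be modeled in a few lines (a conceptual Python sketch, not the actual Alpha script):

```python
from collections import Counter

def import_all_then_dedupe(files):
    """Append every file's rows into one temp list, then remove
    duplicates in a single pass. Returns the unique rows plus a QC
    count of how many duplicate rows were removed."""
    temp = []
    for rows in files:
        temp.extend(rows)                 # no per-file dedup
    counts = Counter(temp)                # QC: copies of each key
    dup_rows = sum(c - 1 for c in counts.values())
    unique = list(dict.fromkeys(temp))    # order-preserving dedup
    return unique, dup_rows

# Three overlapping "rolling window" files of toy keys.
files = [["a", "b", "c"], ["b", "c", "d"], ["c", "d", "e"]]
unique, dups = import_all_then_dedupe(files)
print(unique, dups)  # ['a', 'b', 'c', 'd', 'e'] 4
```

Deduplicating once at the end means the QC count (rows in minus duplicates removed) can be checked against the final table size in a single reconciliation.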



                      #11
                      Re: How can I end up with fewer records with a delete duplicates operation?

                      Glad my suggestion helped...
                      Robin

                      Discernment is not needed in things that differ, but in those things that appear to be the same. - Miles Sanford

