
Announcement


The Alpha Software Forum Participation Guidelines

The Alpha Software Forum is a free forum created for the Alpha Software Developer Community to ask for help, exchange ideas, and share solutions. Alpha Software strives to create an environment where all members of the community can feel safe to participate. In order to ensure the Alpha Software Forum is a place where all feel welcome, forum participants are expected to behave as follows:
  • Be professional in your conduct
  • Be kind to others
  • Be constructive when giving feedback
  • Be open to new ideas and suggestions
  • Stay on topic


Be sure all comments and threads you post are respectful. Posts that contain any of the following content will be considered a violation of your agreement as a member of the Alpha Software Forum Community and will be moderated:
  • Spam.
  • Vulgar language.
  • Quotes from private conversations without permission, including pricing and other sales related discussions.
  • Personal attacks, insults, or subtle put-downs.
  • Harassment, bullying, threatening, mocking, shaming, or deriding anyone.
  • Sexist, racist, homophobic, transphobic, ableist, or otherwise discriminatory jokes and language.
  • Sexually explicit or violent material, links, or language.
  • Pirated, hacked, or copyright-infringing material.
  • Encouraging others to engage in the above behaviors.


If a thread or post is found to contain any of the content outlined above, a moderator may choose to take one of the following actions:
  • Remove the Post or Thread - the content is removed from the forum.
  • Place the User in Moderation - all posts and new threads must be approved by a moderator before they are posted.
  • Temporarily Ban the User - user is banned from forum for a period of time.
  • Permanently Ban the User - user is permanently banned from the forum.


Moderators may also rename posts and threads if they are too generic or do not properly reflect the content.

Moderators may move threads if they have been posted in the incorrect forum.

Threads/Posts questioning specific moderator decisions or actions (such as "why was a user banned?") are not allowed and will be removed.

The owners of Alpha Software Corporation (Forum Owner) reserve the right to remove, edit, move, or close any thread for any reason; or ban any forum member without notice, reason, or explanation.

Community members are encouraged to click the "Report Post" icon in the lower left of a given post if they feel the post is in violation of the rules. This will alert the Moderators to take a look.

Alpha Software Corporation may amend the guidelines from time to time and may also vary the procedures it sets out where appropriate in a particular case. Your agreement to comply with the guidelines will be deemed agreement to any changes to them.



Bonus TIPS for Successful Posting

Try a Search First
It is highly recommended that a Search be done on your topic before posting, as many questions have been answered in prior posts. As with any search engine, the shorter the search term, the more "hits" will be returned, but the more specific the search term is, the greater the relevance of those "hits". Searching for "table" might well return every message on the board while "tablesum" would greatly restrict the number of messages returned.

When you do post
First, make sure you are posting your question in the correct forum. For example, if you post an issue regarding Desktop applications on the Mobile & Browser Applications board, not only will your question not be seen by the appropriate audience, it may also be removed or relocated.

The more detail you provide about your problem or question, the more likely someone is to understand your request and be able to help. A sample database with a minimum of records (and its support files, zipped together) will make it much easier to diagnose issues with your application. Screen shots of error messages are especially helpful.

When explaining how to reproduce your problem, please be as detailed as possible. Describe every step, click-by-click and keypress-by-keypress. Otherwise when others try to duplicate your problem, they may do something slightly different and end up with different results.

A note about attachments
You may only attach one file to each message. Attachment file size is limited to 2MB. If you need to include several files, you may do so by zipping them into a single archive.

If you forgot to attach your files to your post, please do NOT create a new thread. Instead, reply to your original message and attach the file there.

When attaching screen shots, it is best to attach an image file (.BMP, .JPG, .GIF, .PNG, etc.) or a zip file of several images, as opposed to a Word document containing the screen shots. Because Word documents are prone to viruses, many message board users will not open your Word file, therefore limiting their ability to help you.

Similarly, if you are uploading a zipped archive, you should simply create a .ZIP file and not a self-extracting .EXE as many users will not run your EXE file.

Use Xbasic To Import to DBF then loop through DBF to apply conditional SQL updates & inserts


    #16
    Re: Use Xbasic To Import to DBF then loop through DBF to apply conditional SQL updates & inserts

    This is an excellent thread with high-class content.
    From my perspective, it makes a nice change to see the originator (Paul) engaging and evaluating.
    Too often we see threads chopped off without any feedback.
    Thanks, guys.
    See our Hybrid Option here:
    https://hybridapps.example-software.com/


    Apologies to anyone I haven't managed to upset yet.
    You are held in a queue and I will get to you soon.



      #17

      Originally posted by Al Buchholz View Post
      I googled PostgreSQL Import csv and found the copy command.

      sounds pretty straight forward.
      Al, this does seem pretty straightforward, and I suspect I'll try to use it for certain things in the future. However, in this particular experiment, the final destination SQL table was NOT in the same layout as the CSV file, so a direct copy to the final destination is not an option for me. In fact, in other files, I have to read in a CSV, extract individual pieces of data from the rows, and then post them to an SQL table in a format completely incompatible with the original CSV. A direct copy method will not give me the flexibility to perform both format and content conversions on the data as it moves through the import process.

      The script that I wrote gives me the functionality to reverse a text field, parse the inbound text any way I want, and even do datatype conversions during the import process. More importantly, while I have admitted that the script will most likely only be used by me to perform the cut-over once the client is ready to go live, what I left unsaid is that by automating the conversion process, I can re-run the import/conversion as many times as necessary to completely redesign and normalize the client's current data in a way that will benefit them greatly as we move toward a web-enabled app down the road.
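Paul's transform-during-import idea (his actual script is Xbasic; this is a minimal Python sketch with an invented table layout and field names) boils down to: read each CSV row, reshape and type-convert it, then insert into a destination table whose shape differs from the file:

```python
import csv
import io
import sqlite3

# Hypothetical destination layout -- deliberately NOT the same shape as the CSV.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (last_name TEXT, first_name TEXT, joined TEXT, dues REAL)")

# Stand-in for the exported file: one combined name field, a YYYYMMDD date,
# and a price stored as integer cents.
data = io.StringIO('"Tarver, Paul",19990315,12500\n')

for raw in csv.reader(data):
    last, first = [p.strip() for p in raw[0].split(",", 1)]  # parse the text any way we need
    joined = f"{raw[1][:4]}-{raw[1][4:6]}-{raw[1][6:]}"      # YYYYMMDD -> ISO date
    dues = int(raw[2]) / 100                                 # datatype conversion: cents -> dollars
    conn.execute("INSERT INTO members VALUES (?, ?, ?, ?)", (last, first, joined, dues))

print(conn.execute("SELECT * FROM members").fetchone())
# -> ('Tarver', 'Paul', '1999-03-15', 125.0)
```

Because every row passes through the loop, any format or content conversion can happen in it, which a direct COPY cannot do.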
      Paul H. Tarver
      Tarver Program Consultants, Inc.
      www.TPCQPC.com
      www.TPCDataWorks.com

      If you are doing today, what you were doing three years ago; you are backing up, not standing still.



        #18

        Thanks, Ted Giles. It was my thought that perhaps this thread would be useful to other VFP programmers who might be interested in moving to AA. One of the ways I learn best is to see code snippets that perform some function, then tear them apart and use the bits I need to make my own solution. But I believe it works best with real-world examples. One of the things that has always been awesome about the VFP community is their willingness to share code, and I'm seeing a similar trend here with AA; but everyone seems so focused on the web-app side of things that us old-school desktop dev guys (who are trying to catch up!) get a little lost while trying to transition to AA.

        I have to say that I have been extremely impressed with the willingness of AA people to at least offer suggestions and try to help. It really is a positive endorsement of the AA community to see that happen.

        Thanks!
        PHT
        Paul H. Tarver


          #19

          You haven't mentioned the volume of data that is involved. Performance may be an issue depending on the volume.

          When I do this kind of thing, I always script it since it'll be run multiple times to test (and improve).

          I'm thinking you import the data from the copy into a 'temp' table, then script the transform and move the data to the new table with SQL/procedures on the server.
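Al's staging-table suggestion can be sketched with SQLite standing in for the server (table and column names invented for illustration): copy the raw data into a 'temp' table unchanged, then let one set-based SQL statement do the transform and the move.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The staging table mirrors the raw copy; the real table has a different layout.
conn.execute("CREATE TABLE staging (fullname TEXT, amount_cents INTEGER)")
conn.execute("CREATE TABLE members (last_name TEXT, amount REAL)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [("Tarver, Paul", 12500), ("Buchholz, Al", 9900)])

# One set-based statement transforms and moves the rows on the server side,
# instead of the client looping record by record.
conn.execute("""
    INSERT INTO members (last_name, amount)
    SELECT substr(fullname, 1, instr(fullname, ',') - 1),
           amount_cents / 100.0
    FROM staging
""")
print(conn.execute("SELECT * FROM members ORDER BY last_name").fetchall())
# -> [('Buchholz', 99.0), ('Tarver', 125.0)]
```

The trade-off against Paul's row-by-row script is that the transform must be expressible in SQL; in exchange, the server does the heavy lifting in a single pass.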
          Al Buchholz
          Bookwood Systems, LTD
          Weekly QReportBuilder Webinars Thursday 1 pm CST

          Occam's Razor - KISS
          Normalize till it hurts - De-normalize till it works.
          Advice offered and questions asked in the spirit of learning how to fish is better than someone giving you a fish.
          When we triage a problem it is much easier to read sample systems than to read a mind.
          "Make it as simple as possible, but not simpler."
          Albert Einstein



            #20

            Originally posted by Al Buchholz View Post
            You haven't mentioned the volume of data that is involved. Performance may be an issue depending on the volume.
            Al, the member table has over 7,000 records in it, but there are child tables, one of which contains over 200,000 activity records. I'll know more about the performance issues once I've finished the other imports. Perhaps I'll re-run the performance tests from earlier once all the data is loaded, and I'll let you know if there are any differences in load times.
            Paul H. Tarver


              #21

              Originally posted by ptarver View Post
              Al, the member table has over 7,000 records in it, but there are child tables, one of which contains over 200,000 activity records. I'll know more about the performance issues once I've finished the other imports. Perhaps I'll re-run the performance tests from earlier once all the data is loaded, and I'll let you know if there are any differences in load times.
              A relatively small data sample. It probably won't make much difference whichever method you use.
              Al Buchholz


                #22

                Originally posted by ptarver View Post
                Steve, I understand your concern about mingling, but truthfully, this exercise is simply going to be used to migrate a customer's btrieve data to an SQL database, and if we are lucky it will only be used once, for this client. I would perhaps build something different for another client that was migrating (perhaps after I learn more!), but the challenge right now is building the new database structure and populating it with live data as quickly as possible. Secondarily, I'm not worried about using Desktop-only UI elements, because our initial migration for this client is from a DOS-based networked application to a Windows-based networked application. Long term, they want to add a web-based front end to allow remote users to update the database, with the bulk of their intranet work remaining on the desktop application. Maybe it's not what is best, but it is what their budget and timeline will support.
                I too have this scenario of getting btrieve data from a DOS networked app using a TAS database. How are you getting the data into a CSV file?
                Robin

                Discernment is not needed in things that differ, but in those things that appear to be the same. - Miles Sanford



                  #23

                  Originally posted by MoGrace View Post
                  I too have this scenario of getting btrieve data from a DOS networked app using a TAS database. How are you getting the data into a CSV file?
                  Robin: I have the advantage of having written the original DOS program using btrieve as the data engine, so it was an easy process for me to get into the program and write some quick and dirty delimited export routines. Unfortunately, I do not have a magic method for pulling data out of btrieve data systems. Sorry.
                  Paul H. Tarver


                    #24

                    Originally posted by Al Buchholz View Post
                    A relatively small data sample. It probably won't make much difference whichever method you use.
                    Al: I gave you some incorrect information regarding the record counts I posted earlier. The member table has 67,000 records in it with about 23 separate fields. I'll be splitting this file to normalize the data and to give some flexibility with data storage in SQL going forward. I know that is still a small sample of data, but the timeline increased significantly when I went from 499 records to 67,000. The math has kicked in and is working against me.

                    For example, if I take the highest time for PostgreSQL and the highest time for FirebirdSQL and split the difference as my average time to import/update 500 records, I get about 1.38 minutes per batch (the actual average is 83 seconds; I divided by 60 to convert seconds to minutes). If I divide 67,000 total records by 500, I get 134 batches of 500. If each batch takes 1.38 minutes, the total time would be 134 * 1.38, or 184.92 minutes, which is at least 3.08 hours just to update the complete member file. Not the fastest method, but workable if I don't do it too often. However, I will be migrating more than one file, so the total time could easily exceed 5-6 hours unless I take the faster route.
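The batch arithmetic above, checked in Python (using the unrounded 83-second figure, which lands within a fraction of a minute of the 184.92 quoted):

```python
seconds_per_batch = 83                       # measured average for a 500-record batch
minutes_per_batch = seconds_per_batch / 60   # ~1.38 minutes per batch
batches = 67000 // 500                       # 134 batches of 500 records

total_minutes = batches * minutes_per_batch  # ~185 minutes
total_hours = total_minutes / 60             # ~3.1 hours for the member file alone
print(batches, round(total_minutes), round(total_hours, 1))
# -> 134 185 3.1
```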

                    One thought I had was perhaps deleting all of the records in the SQL table first and then doing pure INSERTs, because they are screaming fast compared to field-by-field comparisons and updates.
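The delete-then-pure-INSERT idea, sketched against SQLite (schema invented; the thread's actual targets are PostgreSQL and FirebirdSQL): emptying the table first means no per-record SELECT to compare fields and no conditional UPDATE, just one bulk insert.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER PRIMARY KEY, name TEXT)")
rows = [(i, f"member{i}") for i in range(500)]
conn.executemany("INSERT INTO members VALUES (?, ?)", rows)  # existing data

# Reload strategy: clear the table, then pure inserts -- no row-by-row
# comparison against existing data, one bulk statement instead.
conn.execute("DELETE FROM members")
conn.executemany("INSERT INTO members VALUES (?, ?)", rows)

count = conn.execute("SELECT count(*) FROM members").fetchone()[0]
print(count)  # -> 500
```

The cost is that any rows not present in the source are gone after the reload, so this only works when the import is the sole source of truth for the table.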

                    And herein lies the problem with me as a programmer: because I love figuring this stuff out, I will spend three weeks trying to figure out how to optimize the import process to gain just a few seconds per batch, even though I know that I'll only really be doing this once or twice for the full data migration!
                    Paul H. Tarver


                      #25

                      Definitely truncate the table first if the data allows it.

                      Insert only should be much faster.

                      I know you read that active-link tables are slower, but for insert-only I'd try that.

                      I've been pleasantly surprised in certain cases.
                      Last edited by Al Buchholz; 07-01-2014, 05:25 PM.
                      Al Buchholz


                        #26

                        Originally posted by Al Buchholz View Post
                        Definitely truncate the table first if the data allows it. Insert only should be much faster.
                        The script has now been modified to delete all records in the SQL table prior to uploading, and performance is faster. However, as I am continuously updating the database structure and adding new elements, I'm going to lay off benchmarking for a while until I get the structure finalized.

                        The big take-away is that I got the script working and have been tweaking and improving it, to the point that last night I copied the script, made a few modifications, and changed it to extract different data and update a different SQL table in the database. So not only is the script I wanted working to solve my original need, it is now easily modifiable and re-usable for the other tables I need to upload.
                        Paul H. Tarver


                          #27

                          Are you deleting all of the records or using a truncate? Most relational databases have a TRUNCATE statement. It's similar to a ZAP on a DBF table.

                          http://www.postgresql.org/docs/9.1/s...-truncate.html

                          TRUNCATE quickly removes all rows from a set of tables. It has the same effect as an unqualified DELETE on each table, but since it does not actually scan the tables it is faster. Furthermore, it reclaims disk space immediately, rather than requiring a subsequent VACUUM operation. This is most useful on large tables.
                          Al Buchholz


                            #28

                            FirebirdSQL v2.5 doesn't support the TRUNCATE statement. However, it does support the "DELETE FROM members" command, and I deleted some 25,000 records on several occasions last night; each operation completed in under 1 second.
                            Paul H. Tarver


                              #29

                              Paul,

                              A couple of years ago I was working with a VFP programmer on a large-scale VFP-to-Alpha conversion. This is one function of dozens that "converted" DBF to MySQL (or CSV to MySQL in many cases) via inserts and/or updates. These scripts ran nightly, updating live tables. I think this one updated over 2 (4?) million records each night; it took close to 4 hours to run, if I remember correctly.

                              Code:
                              'Date Created: 12-Oct-2012 03:20:02 PM
                              'Last Updated: 24-Dec-2012 12:28:20 PM
                              'Created By  : pgreulich
                              'Updated By  : pgreulich
                              
                              FUNCTION fncUpdateArchivedItems AS L ( )
                              
                              ''Delete archived_items records ArcId > 2343000 
                              ''Insert records from ArcItm.dbf (recno() > 2343000) into EntApp:archived_items
                              
                              '' e.g. f_LogSync(1,"Note","Source","Target","Script",Count)
                              ''	 vOp = 1 (Data deleted in Target table.)  
                              ''	 vOp = 2 (Data Import begun.)  	
                              ''	 vOp = 3 (Data Update begun.)  	
                              ''	 vOp = 4 (Data Insert begun.)  		
                              ''	 vOp = 5 (Data import completed.)  
                              ''	 vOp = 6 (Data update completed.)  
                              ''	 vOp = 7 (Data Insert completed.)  
                              
                              ON ERROR GOTO ERROR_HANDLER
                              
                               Dim SQL_Reset as c
                               Dim SQL_Insert as c
                               Dim SQL_count as c
                               Dim vFields as c
                               Dim vArgs as c
                              Dim conn as SQL::connection
                              Dim args as SQL::Arguments
                              
                              Dim flag as L
                              Dim errCount as N = 0
                              
                              SQL_Reset = <<%sql%
                              DELETE FROM archived_items WHERE ArcId > 2343000
                              \
                              ALTER TABLE EntApp.archived_items AUTO_INCREMENT = 1 
                              %sql%
                              ''======================================================================================
                              conn.open("::Name::EntApp")
                              flag = conn.execute(SQL_Reset, args)
                              
                              If flag = .t.
                              	f_LogSync(1,"","","archived_items","fncUpdateArchivedItems()")
                              Else
                              	f_LogSync(99,"ERROR: data may NOT have been deleted!","","archived_items","fncUpdateArchivedItems()")	
                              End if
                              
                              vFields= <<%str%
                              ArcTransactionNumber
                              ArcTransactionType
                              ArcTransactionSequence
                              ArcTransactionLine
                              ArcPartNumber
                              ArcOrdered
                              ArcShipped
                              ArcBackOrdered
                              ArcDescription
                              ArcNetPrice
                              ArcCorePrice
                              ArcExchangeType
                              ARCITMCOD
                              ArcGroupCode
                              ArcLocationCode
                              ArcMFGCode
                              ARCITMCTN
                              ARCITMDN
                              ARCITMEL
                              ARCITMPCT
                              ARCITMBOD
                              ARCITMBOP
                              ARCITMSUP
                              ArcCost
                              ARCITMBNK
                              ARCITMMISC
                              ARCITMCTP
                              %str%
                              
                              vArgs= <<%str%
                              :TransactionNumber
                              :TransactionType
                              :TransactionSequence
                              :TransactionLine
                              :PartNumber
                              :Ordered
                              :Shipped
                              :BackOrdered
                              :Description
                              :NetPrice
                              :CorePrice
                              :ExchangeType
                              :ITMCOD
                              :GroupCode
                              :LocationCode
                              :MFGCode
                              :ITMCTN
                              :ITMDN
                              :ITMEL
                              :ITMPCT
                              :ITMBOD
                              :ITMBOP
                              :ITMSUP
                              :Cost
                              :ITMBNK
                              :ITMMISC
                              :ITMCTP
                              %str%
                              
                              vFields = crlf_to_comma(vFields)
                              vArgs = crlf_to_comma(vArgs)
                              SQL_Insert = "INSERT INTO archived_items (" + vFields + ") VALUES (" + vArgs + ")"
                              SQL_count = "SELECT count(*) FROM archived_items"
                              
                              ''Open DBF fetch & loop 
                              ''======================================================================================
                              ' debug(1)
                              tbl = table.open("ArcItm")
                              tbl.index_primary_put("") ''index in record order
                              tbl.fetch_goto(2343000)
                              
                              f_LogSync(4,"","ArcItm.dbf","archived_items","fncUpdateArchivedItems()")
                              i=0
                              tbl.fetch_next()
                              while .not. tbl.fetch_eof()
                              
                              	vTransactionNumber  = tbl.Arcitmdoc
                              	vTransactionType  = tbl.Arcitmtyp
                              	vTransactionSequence  = tbl.Arcitmseq
                              	vTransactionLine  = tbl.Arcitmcnt
                              	vPartNumber  = tbl.Arcitmnum
                              	vOrdered  = tbl.Arcitmord
                              	vShipped  = tbl.Arcitmshp
                              	vBackOrdered  = tbl.Arcitmbo
                              	vDescription  = tbl.Arcitmdsc
                              	vNetPrice  = tbl.Arcitmprc
                              	vCorePrice  = tbl.Arcitmcor
                              	vExchangeType  = tbl.Arcitmexc
                              	
                              	vITMCOD  = tbl.Arcitmcod
                              	
                              	vGroupCode  = tbl.Arcitmgrp
                              	vLocationCode = tbl.Arcitmloc
                              	vMFGCode  = tbl.Arcitmmfg
                              	
                              	vITMCTN  = tbl.Arcitmctn
                              	vITMDN  = tbl.Arcitmdn
                              	vITMEL  = tbl.Arcitmel
                              	vITMPCT  = tbl.Arcitmpct
                              	vITMBOD  = tbl.Arcitmbod
                              	vITMBOP  = tbl.Arcitmbop
                              	vITMSUP  = tbl.Arcitmsup
                              	
                              	vCost  = tbl.Arcitmcst
                              	
                              	vITMBNK  = tbl.Arcitmbnk
                              	vITMMISC  = tbl.Arcitmmisc
                              	vITMCTP  = tbl.Arcitmctp
                              
                              
                              	args.set("TransactionNumber",vTransactionNumber)
                              	args.set("TransactionType",vTransactionType)
                              	args.set("TransactionSequence",vTransactionSequence)
                              	args.set("TransactionLine",vTransactionLine)
                              	args.set("PartNumber",vPartNumber)
                              	args.set("Ordered",vOrdered)
                              	args.set("Shipped",vShipped)
                              	args.set("BackOrdered",vBackOrdered)
                              	args.set("Description",vDescription)
                              	args.set("NetPrice",vNetPrice/10000)
                              	args.set("CorePrice",vCorePrice/10000)
                              	args.set("ExchangeType",vExchangeType)
                              	
                              	args.set("ITMCOD",vITMCOD)
                              	
                              	args.set("GroupCode",alltrim(str(vGroupCode)))
                              	args.set("LocationCode",vLocationCode)
                              	args.set("MFGCode",vMFGCode)
                              	
                              	args.set("ITMCTN",vITMCTN)
                              	args.set("ITMDN",vITMDN)
                              	args.set("ITMEL",vITMEL)
                              	args.set("ITMPCT",vITMPCT)
                              	args.set("ITMBOD",jtodate(vITMBOD+2415020))
                              	args.set("ITMBOP",vITMBOP)
                              	args.set("ITMSUP",vITMSUP)
                              	
                              	args.set("Cost",vCost/10000)
                              	
                              	args.set("ITMBNK",vITMBNK)
                              	args.set("ITMMISC",vITMMISC)
                              	args.set("ITMCTP",vITMCTP)
                              
                              	flag = conn.execute(SQL_Insert, args)
                              	vText = conn.callResult.text
                              	
                              	If flag = .F.
                              		errCount = errCount + 1
                              	End If
                              '	For testing only (100K records take 6 minutes to process)
                              	i=i+1
                              '	IF i >= 1000
                              '		EXIT WHILE
                              '	End if
                              tbl.fetch_next()
                              end while
                              
                              tbl.close()
                              
                              If errCount > 0
                              	f_LogSync(99,"ERROR: Records not inserted.","ArcItm.dbf","archived_items","fncUpdateArchivedItems()",errCount)	
                              End if
                              ''======================================================================================
                              conn.execute(SQL_count, args)
                              rs = conn.ResultSet
                              'count = rs.data(1)
                              
                              conn.close()		
                              
                              f_LogSync(7,"","ArcItm.dbf","archived_items","fncUpdateArchivedItems()",i)
                              
                              fncUpdateArchivedItems = .T.
                              
                              EXIT Function
                              ''-------------------------------
                              ERROR_HANDLER:
trace.WriteLn("i = " + alltrim(str(i,10)) + ", rec# " + alltrim(str(tbl.recno())))
                              errLine = ltrim(str(error_line_number_get()))
                              errCode = error_code_get()
                              cErrCode = ltrim(str(errCode))
                              errText = error_text_get(errCode)
                              
f_LogSync(99,"ERROR @ Line# "+errLine+" ErrorCode# "+cErrCode+" "+errText,"ArcItm.dbf","archived_items","fncUpdateArchivedItems()")	
                              Resume next
                              END FUNCTION
                              Peter
                              AlphaBase Solutions, LLC

                              [email protected]
                              https://www.alphabasesolutions.com


                                #30
Re: Use Xbasic To Import to DBF then loop through DBF to apply conditional SQL updates & inserts

Originally posted by Peter.Greulich
                                Paul,

A couple of years ago I was working with a VFP programmer on a large-scale VFP-to-Alpha conversion. This is one function of dozens that "converted" DBF to MySQL (or CSV to MySQL in many cases) via inserts and/or updates. These scripts ran nightly, updating live tables. I think this one updated over 2 (4?) million records each night; it took close to 4 hours to run, if I remember correctly.
Wow, Peter, that performance is amazing. Can you tell me anything about the hardware/network/server combination you were using? Just curious.

When I reviewed the code, I was reminded that one of the things I wanted to investigate was whether there is a way to programmatically assign the arguments, for example by looping through the fields of the DBF file to set the arguments and build the SQL statement at the same time. Any thoughts on that possibility? Understand I'm still a noob at AA, so there may be an obvious reason why that can't be done, and if so, be nice! :)
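For what it's worth, the idea in the question can be sketched in Xbasic. This is an untested sketch, not Peter's code: it assumes the DBF column names match the MySQL column names, and it relies on the standard table/field methods (table.open(), fields_get(), field_get(), name_get(), value_get()) together with sql::Arguments:

' Sketch: derive the argument list and the INSERT statement from the
' DBF structure itself, instead of hard-coding each field.
dim args as sql::Arguments
dim SQL_Insert as c
dim colList as c = ""
dim argList as c = ""
dim i as n

tbl = table.open("arcitm.dbf")
fieldCount = tbl.fields_get()

' Build "INSERT INTO archived_items (f1,f2,...) VALUES (:f1,:f2,...)"
for i = 1 to fieldCount
	fld = tbl.field_get(i)
	colList = colList + fld.name_get() + ","
	argList = argList + ":" + fld.name_get() + ","
next
colList = left(colList, len(colList)-1)
argList = left(argList, len(argList)-1)
SQL_Insert = "INSERT INTO archived_items (" + colList + ") VALUES (" + argList + ")"

tbl.fetch_first()
while .not. tbl.fetch_eof()
	' Refresh every argument from the current record
	for i = 1 to fieldCount
		fld = tbl.field_get(i)
		args.set(fld.name_get(), fld.value_get())
	next
	flag = conn.execute(SQL_Insert, args)
	tbl.fetch_next()
end while
tbl.close()

Note that fields needing special handling, such as the /10000 price scaling or the Julian date offset in the original function, would still need case-by-case overrides inside the loop.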
                                Paul H. Tarver
                                Tarver Program Consultants, Inc.
                                www.TPCQPC.com
                                www.TPCDataWorks.com

                                If you are doing today, what you were doing three years ago; you are backing up, not standing still.
