This is for you xbasic gurus, I guess.
I have two tables (DBF). The first contains only a few fields and a couple hundred lines: basically a term and an associated code that goes with that term.
The second table has 700,000+ items with many fields, but basically one field could possibly contain a term from the first table, and if so I need to insert the
associated code for that term in another field. Just to make things interesting, there can be several items within the target field, so the only way I have found to
isolate terms is to use "search for word(s)" in the Query Genie (e.g. containsi() in Xbasic). I need to rip through the entire codes file against every line item in the
master file.
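To make the matching requirement concrete, here is roughly the test I need per term, sketched in Python just to show the shape of it (the whole-word, case-insensitive semantics are my reading of "search for word(s)" -- the real job would use containsi() in Xbasic, which may differ):

```python
import re

def search_for_words(text, term):
    """Case-insensitive whole-word test -- one plausible reading of the
    Query Genie's "search for word(s)" filter (assumed semantics, not
    lifted from Xbasic)."""
    return re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE) is not None

print(search_for_words("WIDGET, BLUE GADGET, SPROCKET", "gadget"))  # True
print(search_for_words("WIDGET, BLUE GADGET, SPROCKET", "gad"))     # False, not a whole word
```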
So I thought I would just run the Query Genie, enter a couple of terms, and look at the generated Xbasic to see what the most efficient method to scan the
file would be. Imagine that... you can capture the Xbasic for the filter, but I cannot figure out how to capture the complete Xbasic for the process to apply
against the entire table. I understand that it would return a result that would then have to be read through anyway, and I do not need to see a browse or form.
So next I thought about reading the terms into an array and spinning through the array for each master-file record, doing it all in one process (upon a term match, grab the associated
code and stuff'n go)... but I wondered whether there was a still more efficient method than running a couple hundred items against 700,000+ lines.
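For what it's worth, the array approach I have in mind would look roughly like this, sketched in Python with made-up field names since it's only the shape of the loop that matters (the real thing would be Xbasic against the two DBF tables, and "first match wins" is an assumption on my part):

```python
# Hypothetical stand-ins for the two tables:
# codes:  (term, code) pairs -- a couple hundred rows
# master: records whose "items" field may contain one or more terms
codes = [("widget", "W01"), ("sprocket", "S07")]
master = [
    {"items": "Blue Widget, large", "code": ""},
    {"items": "SPROCKET assembly",  "code": ""},
    {"items": "unrelated part",     "code": ""},
]

# Load the small table once into memory (the "array"),
# lower-cased so each comparison is case-insensitive.
lookup = [(term.lower(), code) for term, code in codes]

# Single pass over the big table; inner loop over the few hundred terms.
for rec in master:
    field = rec["items"].lower()
    for term, code in lookup:
        if term in field:            # containsi()-style test
            rec["code"] = code       # stuff'n go
            break                    # first match wins (an assumption)
```

So the cost is one sequential read of the 700,000+ rows, with at most a couple hundred cheap in-memory comparisons per row.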
Anyone care to share their ideas based on the above? This has to be done quarterly, worst case monthly, and even once is one time too many to think about doing manually, in any case.
Regards,
Keith