hi:
been away awhile on other (hardware) projects, greetings to all!
i have an ascii download (in a somewhat compressed format) that i must extract data from.
when i import the file i need to look up data from several tables to do an append to a
master table. a sample of the import table (abbreviated) is shown below:
field_1 005
field_2 007
.
.
.
field_10 234
field_1 is the truck_id number
field_2 is the driver_id number
.
.
.
field_10 is the route_id number
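(as an aside, here is a minimal sketch of how one record of that import format could be parsed, written in Python rather than Xbasic just to make the field mapping concrete; the field-name-to-id mapping below is illustrative, taken from the descriptions above:)

```python
# Map the generic import field names to the ids they represent
# (only the three fields described above; the rest follow the same pattern).
FIELD_MAP = {
    "field_1": "truck_id",
    "field_2": "driver_id",
    "field_10": "route_id",
}

def parse_record(lines):
    """Turn 'field_N value' lines into a dict keyed by the real id names."""
    record = {}
    for line in lines:
        name, value = line.split()
        key = FIELD_MAP.get(name, name)  # keep unmapped fields as-is
        record[key] = value
    return record

sample = ["field_1 005", "field_2 007", "field_10 234"]
record = parse_record(sample)
# record == {"truck_id": "005", "driver_id": "007", "route_id": "234"}
```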
after the import is complete and filtered i need to look up the actual data from
tables that contain the fleet id of the trucks, the employee number of
the drivers, etc. below is again an abbreviated portion of the script:
tbl2 = table.open("trucks")
query.filter = "truck_id = var->truck" 'compare the field to the variable's value, not to a literal name
query.order = ""
qry2 = tbl2.query_create()
tbl2.fetch_first()
fleet_id = tbl2.fleet_id 'fleet id is fetched from the trucks table
tbl2.close()
tbl3 = table.open("drivers")
query.filter = "driver_id = var->driver"
query.order = ""
qry3 = tbl3.query_create()
tbl3.fetch_first()
employee_num = tbl3.employee_num 'employee number is fetched from the drivers table
tbl3.close()
'
'
'
tbl4 = table.open("routes")
query.filter = "route_id = var->route"
query.order = ""
qry4 = tbl4.query_create()
tbl4.fetch_first()
route_num = tbl4.route_num 'route number is fetched from the routes table
tbl4.close()
this is very hard-drive intensive, as there are about 16,000 records that get sorted each time,
and if ol' mr. hard drive has any weak spots i'm afraid .....
i thought of using variables and loading them from all the lookup tables at the start,
but this would require about 1500 variables.
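(to illustrate the idea without 1500 separate scalar variables: each lookup table could be read once into an in-memory map keyed by its id field, so every imported record does a memory lookup instead of a disk query. a hedged sketch in Python, since this is the concept rather than A5 syntax; table contents here are made up for illustration:)

```python
# Each lookup table loaded once as a dict keyed by its id field.
# Sample contents are hypothetical, for illustration only.
trucks  = {"005": "FLEET-A12"}   # truck_id  -> fleet_id
drivers = {"007": "EMP-4431"}    # driver_id -> employee_num
routes  = {"234": "RT-09"}       # route_id  -> route_num

def resolve(record):
    """Replace the id fields of one imported record with looked-up values."""
    return {
        "fleet_id": trucks[record["truck_id"]],
        "employee_num": drivers[record["driver_id"]],
        "route_num": routes[record["route_id"]],
    }

rec = {"truck_id": "005", "driver_id": "007", "route_id": "234"}
resolved = resolve(rec)
```

the point is that three dicts replace the 1500 variables, and the per-record cost drops from three disk queries to three hash lookups.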
any suggestions on how to speed up this arduous process? the hardware infrastructure is in place
and cannot be altered, so i'm limited to trying to resolve the issue within A5.
best regards,
ed