View Full Version : Import from a .DAT file


Louis Nickerson
04-30-2008, 08:30 PM
I have a .dat file (approximately 550 MB).

I can open it in Excel, but because the file contains more than 65535 records, Excel truncates the data...not good.

It is too large for Notepad to open.

It will eventually open in WordPad, but it hangs the PC.

I need to get this data imported into an Alpha 5 table.

This is disk 1 of 10 and ultimately, I need to import all 10 CDs.

Any help or advice would be most appreciated.


Stan Mathews
04-30-2008, 08:37 PM
Not sure I understand the problem. A .dat file is basically a text file?

If the .dat has an easily discernible structure (uniform throughout the file), it should import as if it were a .txt. If not, you will probably need to read it with the file methods and find the markers (words, phrases) that delimit the usable data.

Louis Nickerson
04-30-2008, 10:00 PM

Thank you for replying.

I know the exact structure of each record. A specific number of character spaces is used for each field. If the data in a field does not fill the entire field width, it is padded on the right with the appropriate number of spaces.
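For illustration, a record laid out this way can be parsed purely by character position. The sketch below is Python, and the field names and widths are made up for the example, not taken from the actual readme file:

```python
# Sketch of parsing one fixed-width, space-padded record.
# The field names and widths here are hypothetical, not the real layout.
FIELDS = [("doc_number", 12), ("name", 30), ("city", 15)]

def parse_record(record):
    """Slice a fixed-width record into a dict, trimming the right-padding."""
    out, pos = {}, 0
    for name, width in FIELDS:
        out[name] = record[pos:pos + width].rstrip(" ")
        pos += width
    return out

rec = "P07000000001" + "STEVEN GROSSBARD, P.A.".ljust(30) + "MIAMI".ljust(15)
parsed = parse_record(rec)
# parsed["name"] holds the full multi-word value with the padding stripped
```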

I tried to import the file into a table, but Alpha does not recognise the .dat file extension as a type of file that can be imported. I also tried renaming the extension from .dat to .txt but the import Genie didn't see any text to import.

I am attaching two files to this thread. One is a small sampling of the data, named 010207c.txt (I changed the extension from .dat to .txt so the message board would allow the upload), and the other is a readme.txt file that details what each field is and its length.

Any suggestions are welcome and appreciated.


Doug Page
04-30-2008, 10:40 PM
I think from a quick look at the file that your biggest problem is that the record length is 1170 characters. I think that typical text file manipulation routines are limited to 1024 characters. The records also contain characters of 00 (zero-zero) which can also cause problems with text manipulation routines.

From your field description file I can see that this is a COBOL file. Having dealt with MicroFocus Cobol files in the past, they could have deleted records in them if the files have not been packed. MF Cobol files are similar to dbf's in that they mark records for deletion. The way they do this is to change the EOL characters from the traditional CR-LF to 00-00. Again, this will cause problems with text file routines as they now won't know where the end of the record is.

In order to deal with this, the best method is to have the client use a report writer to "dump" the data into a friendlier text file. Otherwise, start brushing up on your XBasic and learning the other file options Alpha has for dealing with non-text files. You will then have to verify and manipulate the data and save it to a table. A lot more work, but it can be done.

Edit: Also make sure that the client has packed the files. It just makes things so much easier to code on your end.

Louis Nickerson
04-30-2008, 10:52 PM

Thank you for your input.

The data was provided by the State of Florida. As best I can determine, this is the only format they will provide the data in.

I'm not too worried about leftover deleted records. This database will ultimately be used to populate marketing data in yet another database. If there is a record in this file that is not found in the marketing database, then no action will be taken for that record.

Since I can determine exactly where one record ends and the next begins by counting the number of characters, I was thinking that a consistent record delimiter might not be necessary. I welcome your thoughts on this idea.
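That idea is sound: with a known record length, a reader can walk the file in fixed-size chunks and never look for a delimiter at all. A minimal Python sketch, assuming the 1170-character record length mentioned elsewhere in this thread:

```python
# Sketch: splitting a byte stream into fixed-length records, no delimiter.
# The 1170-character record length is an assumption taken from this thread.
REC_LEN = 1170

def read_records(data, rec_len=REC_LEN):
    """Yield successive fixed-length records from a byte string."""
    for pos in range(0, len(data) - rec_len + 1, rec_len):
        yield data[pos:pos + rec_len]

sample = b"A" * REC_LEN + b"B" * REC_LEN   # two fake records back to back
records = list(read_records(sample))
```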

Thanks again


04-30-2008, 11:44 PM
I had the same result as you with the renamed dat-to-txt file you provided--nothing showed up in the genie for import. But then I opened it in Notepad, closed it, tried again, and everything showed up in the genie and imported without a problem.

imported using ascii and no delimiter--set my own breaks prior to import

Raymond Lyons
05-01-2008, 12:06 AM
Mike beat me to it, but if you throw in more than one record I think you might also have to set it to remove leading blanks. Not sure, though, because my test second record was phony, which might have caused the leading blanks I experienced. Plus I think Mike is talking about using the "Table format - the data in your file is in fixed width columns" option.


05-01-2008, 12:10 AM

I think Mike is talking about using the "Table format - the data in your file is in fixed width columns."
Yes--it just seemed to be the way it was organized and was the first thing I tried...I hope this works for Louis as getting one small sample to work may not be the same as getting the original dat file working.

Doug Page
05-01-2008, 01:19 AM
Mike and Ray,
You didn't just open the file with Notepad but you also SAVED it as well. When you did, it turned all the 00's to 20's (or spaces). This made it a text file that could now be read by Alpha.

The good news is that it can read past the 1024 characters and read the whole 1170-character record. So now the problem just seems to be getting the 00's out of the file and changed to 20's. Your file is 550 MB, so opening it in a text editor is not going to work. I would look into the file.open() and file.read() functions and then manipulate the data from there. For simplicity's sake you could just change all the 00's to 20's and then file.write() back to a new filename. Then you should be able to import as normal.
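Outside Alpha, the same approach (stream the file in chunks, replace every 0x00 byte with a space 0x20, write the result to a new file) could be sketched in Python; the chunked reads mean a 550 MB file never has to fit in memory. The file names here are placeholders:

```python
# Sketch: stream a large file, replacing NUL (0x00) bytes with spaces (0x20).
# Chunked reads keep memory use small no matter how big the file is.
def nul_to_space(infile, outfile, chunk_size=1024 * 1024):
    with open(infile, "rb") as src, open(outfile, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk.replace(b"\x00", b"\x20"))
```

Because the substitution is byte-for-byte, nothing can straddle a chunk boundary, so any chunk size works.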

05-01-2008, 01:33 AM
Right---another assumption on my part I guess--made sense to me while doing it and assumed it would be "matter of course" to do it that way! Darned assumptions can really add to confusion sometimes and am glad there are people like you to straighten it out for people like me who do this!! :)

It sure seems to me that I have opened a file that was over 200 MB once--took a while, but I seem to remember it opening eventually. The largest file I can find so far is only just past 50 MB and opens in about 5 seconds...this is with Notepad on WinXP SP2. Will keep looking, as I cannot find a definitive answer googling for Notepad's file size limit--most results allude to the old limit of 64 KB.

LATER: just opened a 450 MB file (a cab renamed to txt) in 3 1/2 minutes. So I'm guessing Notepad will open even bigger files--are you (everyone) certain that the file cannot be opened by Notepad????

Raymond Lyons
05-01-2008, 04:15 AM
I stopped using Notepad years ago, so I know nothing about its file size limits.

I use NoteTab Pro, which will open files up to 2 GB. The Pro version is $29. Don't know the limit on the free version, but it could be the same.


Stan Mathews
05-01-2008, 09:41 AM
I believe if you change the file association for .dat files to open with Notepad, they will appear in the Alpha ASCII import dialog.

05-01-2008, 09:54 AM
I didn't even have to go that far--just changing it and opening and closing it once in Notepad allowed the file to be seen...prior to opening it, the genie was blank. I tried this with an actual .dat file I found and it worked the same way--changed the extension to .txt and it did not show up in the genie. Then opened and closed it in Notepad (without having the file association changed) and then it would show up.

Either way I think Louis has a way now which is good.

Stan Mathews
05-01-2008, 10:09 AM
prior to opening it the genie was blank. Either way I think Louis has a way now which is good.
True.
Additional note: Even if you can't see the file in the genie it doesn't mean you can't navigate to the directory where it resides and type in the filename.extension.

The developer is of course responsible for ensuring that such a procedure provides Alpha with a usable file.

05-01-2008, 01:55 PM
Using your test file I got the following in a few minutes with Word.

record 2
record 3

This is broken with a linefeed and carriage return. I simply converted the runs of spaces (2 or more) to commas and then collapsed the runs of commas down to 1 comma. It was fast and rude and could be better done. Just wanted to see what a record from the file would look like.

I have no conclusion except ease is not going to be the word of the day. It can be done though.

Stan Mathews
05-01-2008, 02:29 PM
i simply converted the spaces(2 or more) to commas and then eliminated the commas to 1 comma. It was fast and rude and could be better done.
That approach would be fine, but it seems there would be issues with fields that themselves contain spaces (which would get converted to commas) and fields that themselves contain commas.


05-01-2008, 02:44 PM
Correct Stan,

It would take a bit more work, but the idea that there were 6 spaces between fields kinda went haywire (or I read that wrong).
I think the idea was that the inconsistency of the data and fields may be a problem. I am trying some other stuff.

If the records have 6 spaces between fields, it would look something like this. I inserted 00's for a field, taking the place of 2 spaces.

P07000000010STEVEN GROSSBARD, P.A., 00,, 00, ADOMP, 00, 44 WEST FLAGLER STREET, 00,, 00, 2100, 00,, 00,, 00,, 00, MIAMI, 00,, 00, FL33130 US01012007, 00, N FL, 00,, 00,, 00,, 00, GROSSBARD, 00, STEVEN, 00,ESQ. P44 WEST FLAGLER STREET, 00,, 00, MIAMI, 00,, 00, FL33130 P PGROSSBARD, 00, STEVEN, 00, 44 WEST FLAGLER STREET, #2100, 00, MIAMI, 00,, 00, FL33130, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,, 00,

It would take about 8-10 passes to get this to uniform sizes using Word. I think Word can handle the file. Probably half a day to get it good enough to import nicely. Or get close and let Alpha handle it with some operations.

G Gabriel
05-01-2008, 04:13 PM

Louis Nickerson
05-01-2008, 04:55 PM
Stan, Doug, Mike, Ray, Dave and of course Gabe,

Thank you all for looking at this and offering your experience, observations, suggestions and examples.

I think I'm making some forward progress. I am now able to import the .dat file directly into Alpha 5. The formatting is off (i.e., Alpha is not placing all of the data into the appropriate fields), but at least Alpha can now see the data.

Gabe, you seem to have the data in the right fields (I'm assuming you used the Substr() function to extract the data into the correct fields). Whichever method you used, I would really appreciate it if you shared it. I will note, the file you did this with is just a sampling of the real data. The actual .dat file is in excess of 500 MB and there are 10 of these files.

Again thank you all for your time and effort!


G Gabriel
05-01-2008, 05:07 PM
The whole process took a few seconds.. it's easier to do than explain. You could either:
1-zip all your files, I will import them and zip them back, or
2-I could attach a video showing how it's done.
Typing is not something I like to do..

Louis Nickerson
05-01-2008, 05:21 PM

In the interest of learning how to do this for myself, I would prefer the video. Thank you very much for offering.

If you would like to just show me live online, I have a LogMe In Rescue account we could use so I can observe what you are doing.

I am sending additional contact info in a private message.



G Gabriel
05-01-2008, 05:26 PM
I have a LogMe In Rescue account
Never used it before.. how do I log in?

Louis Nickerson
05-01-2008, 05:30 PM
Go to www.logmein123.com

Enter pin code 472392

Select Run

Select Run Again

Say Yes when it says that a technician wishes to connect.

There is a built-in chat client we can use as well


Doug Page
05-01-2008, 05:32 PM

For the sake of this thread and the others like me who would like to see what you've done, could you post something here?


G Gabriel
05-01-2008, 05:37 PM
Definitely..I am just not sure if I can load the video.. I am working on it..give me a few.

G Gabriel
05-01-2008, 05:44 PM
In the attached video, I am showing how to create a wk3 file from the DAT file. Beyond that, it's just a matter of importing the wk3 into Alpha (didn't want to waste any video on that, being a very simple, straightforward matter).
Sorry.. the video will not upload, being a little too big.. I will break it down into smaller ones.

here it is broken down into 3 smaller videos.

05-01-2008, 06:39 PM
Thanks G,

Maybe I'm just a bit too naive/ignorant, but what is the advantage of doing what you did in Excel compared to just using the Alpha import operation on an ASCII file? It looked to be the same manipulation I used when I did it, but maybe not, as the video, at least for me, was not very clear (needed to be magnified I guess). Unless some fields maybe needed to be changed into a different type?

Doug Page
05-01-2008, 08:18 PM

I like the tricks you used, but I think that Mike is right - the result would be the same. For both methods I have some reservations as we are talking about a 550 MB file - 470,000+ rows of data!

G Gabriel
05-01-2008, 08:38 PM
I like the tricks you used, but I think that Mike is right - the result would be the same.
That's a negative.

ASCII will not work here. With ASCII, you need to tell Alpha what the delimiter is. The delimiter here is a space. That won't work, since the names, the companies' names and other fields all have spaces in them, so if you use the space as a delimiter you will end up with a field for each word.

The way I did it is sort of "manually" put the delimiter where I want it, hence you have the street address (which has several words) in one field, the company name (again with multiple words separated by spaces) all in one field, and so on.
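The point is easy to demonstrate: splitting on spaces shatters any multi-word value, while slicing by fixed column positions keeps each field whole. A toy Python comparison (the 25/10 column widths are invented for the example):

```python
# Toy record: a 25-char name field followed by a 10-char city field.
record = "STEVEN GROSSBARD, P.A.".ljust(25) + "MIAMI".ljust(10)

# Treating the space as a delimiter yields a "field" for every word...
naive = record.split()

# ...while fixed-width slicing keeps the multi-word name in one field.
fixed = [record[:25].rstrip(), record[25:35].rstrip()]
```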

Do I have to explain the joke? It is not funny anymore when you explain it!

Try to import it as ASCII and see what happens.

05-01-2008, 08:54 PM
Hi Louis,

Do you realize that Alpha can read in any file, binary or text, as a blob (all at once, or portions at a time) and parse/process it? It won't normally be as fast as a built-in import primitive, but it could be fast enough depending on the format.

Raymond Lyons
05-01-2008, 09:34 PM
That's a negative.

ASCII will not work here. With ASCII, you need to tell Alpha what the delimiter is.

Wait a minute. Are you telling me that using the "Table format - the data in your file is in fixed width columns" option (no delimiter needed) won't work here? Seems to me the only reason it would not work is if the column widths are not fixed, either from one record to another or from one instance of the file to another, say next week. Usually these kinds of structures are set from the source over time unless a field is added or subtracted. At least that has been my experience and the ASCII table option in a saved operation works just fine.


05-01-2008, 09:56 PM
As Ray said. That is how I did it also--I imported into a new table using no delimiter and set all the breaks manually, just as it appeared G did in Excel---hence my question, as I could not tell what the difference was...is there any? And if not, it doesn't seem necessary to take the extra step with Excel...but can it be that simple???


G Gabriel
05-01-2008, 10:15 PM
Go to Alpha, import using Alpha, ASCII, fixed fields as you propose, and try to break the field that shows FL33130 (or any other field, if needed) into two fields: one for the state, the other for the zip.

Doug Page
05-01-2008, 10:43 PM
Gabe, that is easy in the Import Genie. It allows you to click on the data where you want to define the start/end of a field's data. Once you go through that part, it just sucks in the data as you have identified it.

Noticed that all the records you have contain the cursed "00" in the last 4 characters of the record. Here is a really simple (read: no error checking) function that will convert the file without the special ASCII characters, as long as there are none elsewhere in the record.

FUNCTION AlterFLData AS C (FileIn AS C, FileOut AS C, RecLen AS N )
Dim FL_In as P
Dim FL_Out as P
Dim OutText as C
Dim FileSize as N
Dim RecsIn as N
Dim RecLoop as N
'Delete any previous output file so a fresh one can be created.
If file.exists(FileOut) then
	file.delete(FileOut)
end if
FileSize = File.size(FileIn)
RecsIn = FileSize/RecLen
FL_In = File.Open(FileIn,FILE_RO_SHARED)
FL_Out = File.Create(FileOut,FILE_RW_EXCLUSIVE)
For RecLoop = 1 to RecsIn
	OutText = FL_In.read(RecLen)
	'Replace the last 4 data characters (the 00 bytes) with spaces
	'and terminate the record with a standard CR-LF.
	OutText = Left(OutText,RecLen-6) + "    " + crlf()
	FL_Out.write(OutText)
Next RecLoop
FL_In.close()
FL_Out.close()
AlterFLData = FileOut
END FUNCTION

Just call this function like:
AlterFLData("C:\A5Apps\Florida\010207c.txt", "C:\A5Apps\Florida\010207c.alt",1172)

This should run through your file relatively quickly. Do use the record length of 1172, as it is 1170 characters of data plus the CRLF. And since the function deletes the output file if it already exists, please make the file names different. This should at least get you going for the 10 files you are getting.


Gabe, attached is a dbf file to show all fields imported from the ASCII file into the appropriate fields.

G Gabriel
05-02-2008, 01:10 AM
Maybe in your Alpha it does; in mine it does not. If you use fixed fields and try to separate FL33130 into 2 fields, it does not work.

Raymond Lyons
05-02-2008, 03:53 AM

I tried it again after opening it first in Notepad and had no trouble separating FL from 33130 using v7 and v8. Could it be you expected the genie to do this automatically? One has to single-click where you want the separation (double-click on a separation to delete it; drag to move it).

Also, my previous post saying I used NoteTab Pro instead of Notepad was correct at that previous time, but it turns out that for some reason NoteTab Pro does not handle the sample file properly. So this time I did use Notepad and it worked fine. I don't know what the file size limit might be for Notepad.


Bill Griffin
05-02-2008, 08:43 AM
Tried using the import genie and it worked fine as well (V9). I have imported fixed-field-length tables before with V8 without a problem as well. Splitting the fields was simple and straightforward. Starting with the KISS method always seems to work best for me (as in this case). I try to investigate all simple options before jumping through hoops.

Doug Page
05-02-2008, 10:31 AM
NoteTab Pro will do the job but you have to set it to change the substitution character to a space. You'll find that in the options.

G Gabriel
05-02-2008, 11:10 AM
The field will only have two characters showing "FL"; the rest will be discarded and will not be imported as a separate field. After you import, check what's in field_6 and field_7.

05-02-2008, 11:40 AM
I don't have a clue as to why you get different results than everyone else but...


Unless you simply did not add the necessary fields for the zip and country, which is done via what Ray stated in post 36???

Bill Griffin
05-02-2008, 12:04 PM

You bring to mind the old Monty Python sketch,

Good Morning, I'd like to have an argument, please. :)

G Gabriel
05-02-2008, 12:28 PM


You bring to mind the old monty python sketch,

Good Morning, I'd like to have an argument, please.
Sounds like you are seeing things from your own prism!
Sure hope you have a better explanation than that!
Perhaps 2 different versions of Alpha?

Bill Griffin
05-02-2008, 12:30 PM
Looks like Mike is using V8 and I am using V9; not sure what others are using, and since I am not sitting in front of your computer, I can't offer any explanations. Perhaps download the original txt file, open it in Notepad and save it, and try it again?

G Gabriel
05-02-2008, 12:40 PM
Now you are sitting in front of my computer:

Raymond Lyons
05-02-2008, 12:45 PM

Watched how you did it. You are not doing it correctly. Up top where the marks are--click there to put in another vertical line. That's all there is to it.


G Gabriel
05-02-2008, 12:48 PM
I already tried that.. it does not do anything. I clicked and clicked until the neighbors called the police.. no lines were added!

G Gabriel
05-02-2008, 12:51 PM
OK... never mind.. I skipped one step, being used to wk3, where you could see these lines.. I found it. The first field being too big, the lines were out of sight.

Bill Griffin
05-02-2008, 01:01 PM
Now that I can see what you are doing, I do have a better explanation. You are decreasing the size of field 6 from 14 to 2 characters. You are not following the genie's directions on how to split a fixed-length field. The method you are using just takes a best guess as to where the fields are. You are redefining the length manually, therefore dropping the other 12 characters. New fields are not just automatically created this way. Start over and read the directions on the first screen about splitting the fixed-length record.

Raymond Lyons
05-02-2008, 01:04 PM

Yes, I watched your video again and this time saw that you went right past the place where one puts in the separation lines. Prior to this I was about to call your police department and have them bop you upside the head! (just kidding, of course!)


Raymond Lyons
05-02-2008, 04:11 PM
NoteTab Pro will do the job but you have to set it to change the substitution character to a space. You'll find that in the options.

Doug, thanks for the above. Somehow I knew NoteTab Pro would not let me down like that. Now I can change the binary substitution character to anything I might want, not just a space.