This question came up on [Macintouch](http://www.macintouch.com/readerreports/filemaker9/index.html):
>Can you import an ASCII (.txt) file into Filemaker Pro 9 that uses a dollar sign ($) as a field delimiter within a record? The file is too big to bring into Excel first (on the way to filemaker).
Unfortunately, FileMaker 9 does not change the supported import formats, and $ delimited is *not* on the list. On the bright side, there are several ways to deal with this sort of thing in FileMaker.
## Import and Parse
You can go ahead and import the file. FileMaker will probably see each line in the file as one field value. Create a temporary text field in your table, then import the file, mapping its one line to the new field. Using a looping script, go through each record, parse the incoming text, and move it to your real fields:
    Go to Record/Request/Page [First]
    Loop
      Set Field [My Field, GetValue(Substitute(TEMP, "$", "¶"), 3)]
      # add as many Set Field steps as you need
      Go to Record/Request/Page [Next, Exit after Last]
    End Loop
In the above code snippet, we use this calculation to parse the incoming text:
    GetValue(Substitute(TEMP, "$", "¶"), 3)
It grabs the *third* value from the $-delimited file. Here’s how it works:
First, it turns every "$" into a "¶" (FileMaker's return character). So a line like this:

    John$Smith$555-1212

becomes this:

    John
    Smith
    555-1212
Then, we call upon the `GetValue` function to pull one line out of this multi-line bit of text.
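If you want to sanity-check the parse outside FileMaker, `cut` does roughly the same job as the `Substitute`/`GetValue` pair. Here's a quick sketch using a made-up sample line:

```shell
# A hypothetical $-delimited record, for illustration only
line='John$Smith$555-1212'

# Treat "$" as the field delimiter and pull out the third field,
# analogous to GetValue(Substitute(TEMP, "$", "¶"), 3)
printf '%s\n' "$line" | cut -d'$' -f3
# → 555-1212
```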
## Pre-clean the Data
If you prefer, you can clean the file up before import. Many text editors have powerful search and replace functions, and can handle files that bring Excel to its knees.
On Mac OS X, if you’re moderately comfortable on the command line, you can do this:
    tr '$' '\t' < yourfile.txt > thecleanfile.txt
This command will [tr]anslate every “$” into a tab, and put the new cleaned-up data in a file called `thecleanfile.txt`. You can then import the file as a tab-delimited file.
>Note: If the original file contains tabs, they will get in the way, so you need to convert them to something else (like spaces) first.
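One way to handle that is to chain two `tr` calls: squash any existing tabs to spaces first, then convert the dollar signs. A sketch, using a tiny made-up input file:

```shell
# Create a hypothetical input file containing both a tab and "$" delimiters
printf 'a\tb$c$d\n' > yourfile.txt

# Step 1: turn existing tabs into spaces so they can't masquerade as delimiters
# Step 2: turn every "$" into a tab
tr '\t' ' ' < yourfile.txt | tr '$' '\t' > thecleanfile.txt
```

The result, `thecleanfile.txt`, is then safe to import as a tab-delimited file.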
Using these techniques, you can bring many atypical data formats into FileMaker, where they’re much easier to work with.
## 4 thoughts on "Importing Non-Standard Text Files"
The TR command line sample had a typo. It was putting a Mac OS style line ending in place of $, instead of a tab. I just fixed it. Sorry ’bout that 🙂
Thanks for the tip.
I have a similar, slightly more complicated problem. I am importing whole folders of text files. Those text files have lines that I want converted to records. I want to accomplish more or less the same thing: import the content of the text file into a TEMP field and parse it later.
The only problem is that the content of that TEMP field has to be converted to N number of records and then the data (a couple of fields only) parsed to fields.
How do I start to parse the lines that are going to be records (some lines are junk) and then parse the fields?
Text parsing is quite challenging in FileMaker. I can’t go into detail about exactly how you would do it, because the approach differs depending on the data you want to parse. I might be able to at least point you in the right direction. Here are a couple of questions for you:
What type of delimiter does the file use?
Are the lines you would like to dispose of uniform?
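If the junk lines do share a recognizable pattern, one option is to filter them out before the import even starts. A sketch, assuming (hypothetically) that real records contain the "$" delimiter and junk lines do not:

```shell
# Hypothetical sample: comment-style lines are junk, "$"-delimited lines are records
printf '# header junk\nJohn$Smith\n# more junk\nJane$Doe\n' > raw.txt

# Keep only lines that contain a literal "$" (i.e., look like real records)
grep '\$' raw.txt > records.txt

cat records.txt
# → John$Smith
# → Jane$Doe
```

The cleaned `records.txt` can then go through the TEMP-field-and-parse routine described in the post.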
Great comment. Really helped me with a large file (just under 1 million records) that was pipe-delimited. That would have been 15 million find/replace operations. Wow — the Terminal tr command is virtually instantaneous!
However, the syntax needed to be a little different, it seems, at least on my OS X 10.5.8 machine. To go from pipes to commas, the command looks like this:
    tr "|" "," < yourfile.txt > thecleanfile.txt

As a Unix newbie, I didn’t know that I needed double-quotes around the strings. And I didn’t know that the “<” and “>” are needed to specify the input and output files respectively. I thought they were the typical sort of bracketing used by folks elsewhere to indicate “install your own text here” so I omitted them. Oops.
Just thought I’d mention that, since this page seems to be directed to folks who, like me, are not Terminal experts.