Reading text files using read.table


I have a text file with an id and name column, and I'm trying to read it into a data frame in R:

d = read.table("foobar.txt", sep="\t")

But for some reason, a lot of lines get merged -- e.g., in row 500 of my data frame, I'll see something like

row 500: 500 Bob\n501\tChris\n502\tGrace

[So if my original text file has, say, 5000 lines, the dimensions of my table will only end up being 1000 rows and 2 columns.]

I've had this happen to me quite a few times. Does anyone know what the problem is, or how to fix it?

2/21/2018 4:16:41 AM

From ?read.table: "The number of data columns is determined by looking at the first five lines of input (or the whole file if it has less than five lines), or from the length of col.names if it is specified and is longer. This could conceivably be wrong if fill or blank.lines.skip are true, so specify col.names if necessary."
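That behavior is easy to see with a small ragged file. A minimal sketch (file contents and column names invented for illustration):

```r
tf <- tempfile(fileext = ".txt")
# A ragged file: the second line has three fields, the others two.
writeLines(c("1\ta", "2\tb\tX", "3\tc"), tf)

# fill = TRUE pads the short lines with NA; the column count (3) comes
# from scanning the first five lines of the file.
d <- read.table(tf, sep = "\t", fill = TRUE)
dim(d)  # 3 rows, 3 columns; d[1, 3] and d[3, 3] are NA

# col.names, when longer than that guess, wins instead: 4 columns.
d2 <- read.table(tf, sep = "\t", fill = TRUE,
                 col.names = c("id", "name", "x", "y"))
ncol(d2)  # 4
```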

So perhaps your data file isn't clean. Being more specific about its structure will help the import:

d = read.table("foobar.txt",
               sep = "\t",
               col.names = c("id", "name"),
               fill = FALSE)

col.names pins the table to exactly two columns, and fill = FALSE makes read.table stop with an error on any line that doesn't have exactly two fields, instead of silently padding or merging.
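One common way a file turns out "not clean" with exactly this merged-lines symptom is a stray quote character in the data: read.table's default quote set includes both " and ', so an unmatched quote opens a quoted field that swallows the following lines. A minimal sketch (file contents invented for illustration); disabling quoting with quote = "" is the usual fix:

```r
tf <- tempfile(fileext = ".txt")
writeLines(c("1\tAlice", "2\t'Bob", "3\tChris", "4\tGrace"), tf)

# The stray single quote on line 2 opens a quoted field that runs to the
# end of the file, so lines 2-4 collapse into one row (with an
# "EOF within quoted string" warning).
bad <- read.table(tf, sep = "\t")

# quote = "" turns quoting off entirely: four rows, two columns.
good <- read.table(tf, sep = "\t", quote = "")

nrow(good)  # 4
```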

9/12/2009 9:56:45 PM

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow