OK, the problem I had was with a CSV file being uploaded. I'm lucky that the file will always be pretty much the same 126 columns, and I only needed to process/query 15 of those. SO (for search engine status): how to skip fields while using LOAD DATA INFILE. Those of you trying this, check with your host provider; as was the case with mine, I had to add LOCAL to the statement, hence
Code:
$sqlstatement="LOAD DATA LOCAL INFILE
How to skip fields on input using LOAD DATA LOCAL INFILE.
Code:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
[REPLACE | IGNORE]
INTO TABLE tbl_name
[CHARACTER SET charset_name]
[{FIELDS | COLUMNS}
[TERMINATED BY 'string']
[[OPTIONALLY] ENCLOSED BY 'char']
[ESCAPED BY 'char']
]
[LINES
[STARTING BY 'string']
[TERMINATED BY 'string']
]
[IGNORE number LINES]
[(col_name_or_user_var,...)]
[SET col_name = expr,...]
By default, when no column list is provided at the end of the LOAD DATA INFILE statement, input lines are expected to contain a field for each table column. If you want to load only some of a table's columns, specify a column list. (from the dev.mysql.com docs)
Code:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);
You must also specify a column list if the order of the fields in the input file differs from the order of the columns in the table. Otherwise, MySQL cannot tell how to match input fields with table columns.
The column list can contain either column names or user variables. With user variables, the SET clause enables you to perform transformations on their values before assigning the result to columns.
User variables in the SET clause can be used in several ways. The following example uses the first input column directly for the value of t1.column1, and assigns the second input column to a user variable that is subjected to a division operation before being used for the value of t1.column2:
Code:
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
(column1, @var1)
SET column2 = @var1/100;
You can discard an input value by assigning it to a user variable and not assigning that variable to a table column. Using @dummy as the variable essentially skips over the field, leaving the corresponding table column NULL (or its default). Way faster. I have a lot of fields, so to keep track I named them all COLxx. My code looks something like this:
Code:
$sqlstatement = "LOAD DATA LOCAL INFILE '$temp'
    INTO TABLE myuploadtable
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES
    (@COL1, @COL2, @COL3, @COL4, @COL5, @COL6, @COL7, @COL8, @COL9, @COL10,
     @COL11, @COL12, @COL13, myfield1, myfield2, myfield3, @COL17, myfield4,
     myfield5, myfield6, myfield7, @COL22, @COL23, @COL24, myfield8, myfield9,..... )";
mysql_query($sqlstatement) or die(mysql_error());
echo "It worked";
This worked out OK. I have not checked security yet, but this has been tested.
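One tip: hand-writing a 126-entry column list is error-prone, so you can generate it instead of typing it. Here's a rough sketch in Python (purely illustrative; the helper name, positions, and column names are made up, not from my actual code) that builds the (@COL1, ..., myfield1, ...) list from a mapping of wanted 1-based field positions to table column names:

```python
def column_list(total_cols, wanted):
    """Build a LOAD DATA column list that keeps only the wanted fields.

    wanted: dict mapping 1-based CSV field position -> table column name.
    Every other position becomes a throwaway user variable (@COLn),
    so MySQL reads the field but never assigns it to a column.
    """
    parts = [wanted.get(i, "@COL%d" % i) for i in range(1, total_cols + 1)]
    return "(" + ", ".join(parts) + ")"

# Example: a 6-field file where we only keep fields 3 and 5
print(column_list(6, {3: "myfield1", 5: "myfield2"}))
# -> (@COL1, @COL2, myfield1, @COL4, myfield2, @COL6)
```

You'd then paste (or interpolate) the returned string into the LOAD DATA LOCAL INFILE statement; for my 126-column file that's a lot less typing and a lot fewer off-by-one mistakes.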