Can I use SQL Bulk Copy but exclude rows that don't meet a certain set of criteria? #55
@VictorioBerra Not easily, no, since the ColumnHeader functionality is more about mapping CSV columns to SQL columns. What I think would be needed is to extend CsvReader with a Func<Row, bool> delegate that would be evaluated by NextRecord, so it would only return true if the row is to be returned.
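A minimal sketch of that idea, using composition rather than changing CsvReader itself (the FilteredCsvReader class and keepRow delegate are hypothetical names for illustration, not part of the library):

```csharp
using System;
using LumenWorks.Framework.IO.Csv;

// Hypothetical wrapper, not part of the library: advances the inner
// reader until a record satisfies the predicate or the input ends.
public sealed class FilteredCsvReader
{
    private readonly CsvReader _inner;
    private readonly Func<CsvReader, bool> _keepRow;

    public FilteredCsvReader(CsvReader inner, Func<CsvReader, bool> keepRow)
    {
        _inner = inner;
        _keepRow = keepRow;
    }

    public bool ReadNextRecord()
    {
        while (_inner.ReadNextRecord())
        {
            // Only surface records the predicate accepts.
            if (_keepRow == null || _keepRow(_inner))
                return true;
        }
        return false;
    }

    // Delegate field access to the wrapped reader.
    public string this[int field] => _inner[field];
}
```

Usage would then look something like `new FilteredCsvReader(new CsvReader(new StreamReader(fileName), true), r => int.Parse(r[0]) == 10000)`, and filtered rows would never be visible to the caller.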
@phatcher thanks for the quick reply. I will see what I can do.
I put my Func evaluation code here, but I'm having some trouble "skipping" the current row depending on whether it returned true/false:

```csharp
if (Filter != null)
{
    var shouldKeepRecord = Filter.Invoke(this);
    if (!shouldKeepRecord)
    {
        // Causes a row to be skipped?
        SkipToNewLine(ref _nextFieldStart);
    }
}
```

I tried SkipToNewLine(), but that just skipped the following line and the very last line. I am using the reader like this currently:

```csharp
foreach (var fileName in files)
{
    Logger.Info(string.Format("Reading in File: {0}", fileName));
    using (var reader = new CsvReader(new StreamReader(fileName), true, (fields) =>
    {
        // Fields come back as strings, so parse before comparing to a number.
        var someNumberField = int.Parse(fields[0]);
        if (someNumberField == 10000)
        {
            // Return true to keep a row
            return true;
        }
        // Return false to discard this row
        return false;
    }))
    {
        using (var outputStream = new StreamWriter(@"C:\Users\toryb\Downloads\CsvReader Output\output.txt"))
        {
            while (reader.ReadNextRecord())
            {
                var sb = new StringBuilder();
                // ... removed for brevity ...
                outputStream.WriteLine(sb.ToString());
            }
        }
    }
}
```
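An alternative that avoids touching CsvReader internals entirely is to apply the predicate in the consuming loop and `continue` past unwanted rows. A sketch, as a drop-in replacement for the `while` loop in the snippet above (the criterion is the same hypothetical one):

```csharp
while (reader.ReadNextRecord())
{
    // Fields are strings, so parse before comparing to a number.
    if (!int.TryParse(reader[0], out var someNumberField) || someNumberField != 10000)
        continue; // discard this row

    var sb = new StringBuilder();
    // ... build the output line ...
    outputStream.WriteLine(sb.ToString());
}
```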
So I think maybe this works when placed after this line?
Let's say, using the example here, I did not want to include a row if the column OpenPrice was < 100.00. Is this possible?
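For the bulk-copy side of the question, one approach is to buffer only the rows that pass the filter into a DataTable and hand that to SqlBulkCopy.WriteToServer. This is a sketch under stated assumptions: the "Quotes" destination table, the column layout, the CSV having OpenPrice as its first column, and connectionString are all illustrative, and buffering the whole file in memory will not scale to very large inputs (a streaming IDataReader wrapper would be better there).

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Globalization;
using System.IO;
using LumenWorks.Framework.IO.Csv;

static DataTable LoadFilteredQuotes(string csvPath)
{
    var table = new DataTable();
    table.Columns.Add("OpenPrice", typeof(decimal));
    // ... add the remaining columns to match the destination table ...

    using (var reader = new CsvReader(new StreamReader(csvPath), true))
    {
        while (reader.ReadNextRecord())
        {
            // Assumes OpenPrice is the first CSV column.
            var openPrice = decimal.Parse(reader[0], CultureInfo.InvariantCulture);
            if (openPrice < 100.00m)
                continue; // exclude rows below the threshold

            table.Rows.Add(openPrice /*, remaining fields */);
        }
    }
    return table;
}

static void BulkInsert(DataTable table, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    using (var bulk = new SqlBulkCopy(connection))
    {
        connection.Open();
        bulk.DestinationTableName = "Quotes"; // hypothetical destination table
        bulk.WriteToServer(table);
    }
}
```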