Fails to detect delimiter #45
Comments
As you can see, the `detect` function is pretty basic (https://github.com/Inist-CNRS/node-csv-string/blob/master/src/CSV.ts#L57-L65); comma is the default value!
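For illustration, here is a minimal sketch (not the library's actual code, and the function name is mine) of the first-match style of detection described in this thread: scan the input and return the first character that appears in the list of accepted separators, defaulting to `,`.

```typescript
// Accepted separators, comma first (assumed list for this sketch).
const SEPARATORS = [',', ';', '|', '\t'];

// Return the first character of the input that is an accepted
// separator; fall back to "," when none is found.
function detectFirstMatch(input: string): string {
  for (const ch of input) {
    if (SEPARATORS.includes(ch)) {
      return ch;
    }
  }
  return ','; // comma is the default value
}
```

Because this returns the first separator character encountered, a quoted field such as `"a,b"` at the start of a tab-delimited line is enough to make it pick `,`.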
I understand that; however, shouldn't it change if it detects a different delimiter (tab in my case)?
Hello, I noticed that problem too. I tried to detect the best separator in a test string using the `detect` function, then ran a couple of further tests. Since the issue has been reported, do you plan to fix it, or do you consider it correct behaviour? Thank you.

EDIT: The function defines the "best" delimiter as the first character found in the input that is also included in the array of accepted separators, without considering escaped content. So for now I created my own detect function, which uses the same array of allowed delimiters, except that I do not count separators inside strings escaped by either double or single quotes, and it returns the most recurring separator rather than the first one encountered. Let me know if you're interested in my function; I'd be glad to share it with you. Have a good day!
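The approach described above can be sketched as follows. This is a hedged reconstruction, not the commenter's actual code: it counts each candidate separator outside single- or double-quoted sections and returns the most frequent one, falling back to `,`.

```typescript
// Candidate separators (assumed to match the library's accepted list).
const CANDIDATES = [',', ';', '|', '\t'];

// Count candidate separators outside quoted sections and return the
// most frequent one; default to "," when none occurs.
function detectByFrequency(input: string): string {
  const counts = new Map<string, number>(CANDIDATES.map((c) => [c, 0]));
  let quote: string | null = null; // currently open quote char, if any

  for (const ch of input) {
    if (quote !== null) {
      if (ch === quote) quote = null; // closing quote
    } else if (ch === '"' || ch === "'") {
      quote = ch; // opening quote: ignore separators until it closes
    } else if (counts.has(ch)) {
      counts.set(ch, (counts.get(ch) ?? 0) + 1);
    }
  }

  let best = ',';
  let bestCount = 0;
  for (const [sep, n] of counts) {
    if (n > bestCount) {
      best = sep;
      bestCount = n;
    }
  }
  return best;
}
```

With this scheme, tab-delimited data containing a quoted field like `"c,d"` is detected as tab-delimited, since the comma inside the quotes is not counted.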
@pbrunisholz Thank you for spending time on this; your suggestion sounds good. As you already implemented it, why don't you create a pull request with the changes, if you like? It might also be easier to discuss in a PR, perhaps saving some time.
I'm trying to use the `detect` function to determine the delimiter used for data which can be either tab- or comma-delimited; however, it always returns the delimiter as `,`, even if there are no commas in the data.