Exporting from database to CSV

xuanbao · 817 views
This resource was shared some time ago; the information in it may have since evolved or changed.
Hello,

I am looking to add an endpoint to an API I have written in Go that exports database search results to a CSV file and provides a download link.

I have been looking into this, but my main issue is that I will be writing up to 30,000 rows at a time, and I cannot seem to find examples of how anybody has done this efficiently. I am quite new to Go, so I may be missing something.

Can anybody point me in the right direction and highlight some considerations, please?

---
**评论:**

**jsilvela:**

I don't have code to share for this, and I don't know which database you are using, but I would suggest looking into a cursor so you don't need to process all 30k rows in one go.

E.g. http://www.postgresql.org/docs/9.2/static/plpgsql-cursors.html

**StabbyCutyou:**

I would advise against using cursors. They usually come with significant performance penalties and other issues, and are a bit of a crutch.

Especially in this case, since Go's SQL library is already built to handle large datasets by streaming the results as they're read, rather than forcing them all into memory at once.

For more reading on why you should probably just forget they even exist:

http://www.datavail.com/category-blog/sql-server/curse-of-the-cursors-in-code-development/
http://stackoverflow.com/questions/58141/why-is-it-considered-bad-practice-to-use-cursors-in-sql-server

In the interest of fairness, here is someone who has good things to say about them, though some of it is incorrect or misleading, as evidenced by the comments on the top response, where the author admits the information is second-hand:

http://stackoverflow.com/questions/287445/why-do-people-hate-sql-cursors-so-much

So, for the sake of any DBA who might be administering the servers your code runs on: try to forget they exist.

**nicerobot:**

First, I cannot stress enough that everyone should stop using CSV and use TSV instead. TSV is a far simpler format: just as easy to create and far easier to read (as long as your data doesn't contain tabs). Now, with that out of the way, please see https://github.com/sqlrest/sql.rest as an example of how to export from PostgreSQL.

**StabbyCutyou:**

I wrote a utility in Go that does basically this. It's a binary, not a library, but feel free to crib any code that might be helpful.

https://github.com/StabbyCutyou/sqltocsv/blob/master/sqltocsv.go

Please do not use cursors for this. The Go database library was written with large datasets in mind and will not load the entire result set into memory. Cursors are designed as a bit of a crutch (IMO), and database performance suffers because of them.

The example code uses TSV (despite the library's name :) ), but you can easily adapt the approach for CSV.

The library is in a somewhat half-finished state, but I currently use it to handle extremely large (several-gigabyte) backups, so it should be a fine approach for anything that tops out near 30k results.

**joshuaprunier:**

Here is a small cross-platform MySQL query-to-CSV command-line tool I created for use at my company. MySQL outfiles are normally generated on the database server's file system, which requires sudo access or an NFS mount for users to retrieve their CSV file. This program gives internal users the ability to securely generate outfiles directly from their laptops, or to automate CSV query reports via cron, etc.

https://github.com/joshuaprunier/mycsv

It's definitely MySQL-specific. I had to customize the standard library csv package to produce output identical to what MySQL does. The channel communication/syncing is probably overkill, but reading/writing sql.RawBytes is pretty much what you have to do when you don't know what results your query will return.

Hopefully this helps a little even if you aren't using MySQL. Best of luck!
