XLS and CSV files

agolangf · 691 views
Hi Reddit!

We posted on this subreddit earlier this week about web scraping, and it was very helpful. Our task is to gather multiple CSV and XLS files and combine them into a master list. We are able to download the files from their URLs, but we do not know how to combine them all into one file. Our knowledge of Go is limited, so we appreciate any help we can get.

Thanks!

---

**Comments:**

**Bromlife:**

CSV: https://golang.org/pkg/encoding/csv/

XLS: https://github.com/tealeg/xlsx

**ItsNotMineISwear:**

`encoding/csv` is pretty rough, though. You have to take the header into account yourself and everything! I can't find any more abstracted CSV libraries, though.

Despite how "meh" the format is in some ways, I do like using CSVs over JSON in some situations.

**barsonme:**

The header doesn't seem like that big of a deal. Simply:

```go
header, err := r.Read()
if err != nil {
    // ...
}
// read the rest of the file
```

Or:

```go
rows, err := r.ReadAll()
if err != nil {
    // ...
}
// not sure whether ReadAll can return an empty slice;
// assuming it doesn't
header := rows[0]
rows = rows[1:]
```

**steakholder69420:**

Thanks, everyone, for helping us out! We got it working now!

**picapiggy:**

It might help to be a little more specific. Have you figured out how to parse the CSV and XLS files? How do you want to combine them? What do you want to output?

**jamra06:**

Make a goroutine that listens on a channel. The channel accepts a struct that defines the data you will add to or update on your central list.

If you want to download from all sources at the same time, or if the program runs on some kind of schedule, make a goroutine for each web scraper. Each scraper downloads its data, iterates through the rows, assembles the struct, and pushes it into the channel.
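For readers with the same question, here is a minimal sketch of one way to do what the original post asks: merge several downloaded CSV files into a single master CSV using only `encoding/csv`, writing the header just once (along the lines of barsonme's `ReadAll` snippet). The paths `input/*.csv` and `master.csv` are placeholder assumptions, and the sketch assumes every input shares the same column layout; XLS inputs would need a separate reader such as the tealeg/xlsx package linked above.

```go
// A minimal sketch of combining several CSV files into one master file.
// Assumes every input has the same columns; file paths are placeholders.
package main

import (
	"encoding/csv"
	"log"
	"os"
	"path/filepath"
)

func main() {
	out, err := os.Create("master.csv") // placeholder output name
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	w := csv.NewWriter(out)
	defer w.Flush()

	paths, err := filepath.Glob("input/*.csv") // placeholder input location
	if err != nil {
		log.Fatal(err)
	}

	wroteHeader := false
	for _, path := range paths {
		f, err := os.Open(path)
		if err != nil {
			log.Fatal(err)
		}

		r := csv.NewReader(f)
		rows, err := r.ReadAll()
		f.Close()
		if err != nil {
			log.Fatal(err)
		}
		if len(rows) == 0 {
			continue // empty file: nothing to copy
		}

		// Write the header only once, taken from the first non-empty file.
		if !wroteHeader {
			if err := w.Write(rows[0]); err != nil {
				log.Fatal(err)
			}
			wroteHeader = true
		}

		// Copy the data rows, skipping each file's own header line.
		for _, row := range rows[1:] {
			if err := w.Write(row); err != nil {
				log.Fatal(err)
			}
		}
	}
}
```

Note that `ReadAll` loads each file fully into memory; for very large inputs, reading row by row with `r.Read()` in a loop avoids that.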
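jamra06's channel pattern might look roughly like the sketch below: a single collector goroutine owns the master list and receives rows over a channel, while one goroutine per source pushes its rows into that channel. The `Record` struct and `fetchRows` helper are hypothetical stand-ins for whatever the real scrapers download and parse.

```go
// A rough sketch of the collector/scraper channel pattern described above.
package main

import (
	"fmt"
	"sync"
)

// Record is a hypothetical row destined for the master list.
type Record struct {
	Source string
	Fields []string
}

// fetchRows stands in for a scraper that downloads and parses one source.
func fetchRows(url string) []Record {
	return []Record{{Source: url, Fields: []string{"example"}}}
}

func main() {
	urls := []string{"https://example.com/a.csv", "https://example.com/b.xls"}

	records := make(chan Record)
	done := make(chan struct{})

	// Collector: the only goroutine that touches the master list.
	var master []Record
	go func() {
		for rec := range records {
			master = append(master, rec)
		}
		close(done)
	}()

	// One goroutine per source, each pushing its rows into the channel.
	var wg sync.WaitGroup
	for _, url := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			for _, rec := range fetchRows(u) {
				records <- rec
			}
		}(url)
	}

	wg.Wait()
	close(records)
	<-done // wait for the collector to drain the channel

	fmt.Println("collected", len(master), "records")
}
```

Because only the collector goroutine appends to `master`, the slice needs no mutex; the channel is the only point of synchronization.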
