<p>Right now I have a Python script that parses filenames in about 90,000 folders. They are organized by date: there is one folder for 2015, under that folders 01 to 12, under each month one folder per day, and under each existing day folder there are time folders...</p>
<p>If I wanted to do this faster using Go, would a good option be to start a goroutine for each month directory I find, so that each goroutine walks its own part of the tree? Covering two years, I would get a maximum of 24 goroutines parsing their months in parallel.</p>
<p>Not sure if it is relevant, but for further info: the machine I would run this on has 12 cores and 128 GB of memory. I am not sure of the disk speed, but it looks good.</p>
<hr/>**Comments:**<br/><br/>schumacherfm: <pre><p><a href="https://github.com/MichaelTJones/walk" rel="nofollow">https://github.com/MichaelTJones/walk</a></p>
<p>A fast parallel version of Go's filepath.Walk()</p></pre>dlsniper: <pre><p>If you can saturate the disk with goroutines, do it. You have to test on your hardware to find out what's the best way forward.</p></pre>tty5: <pre><p>On an HDD it might be slower, because the head has to seek for each read. On an SSD it should be significantly faster.</p></pre>