json.Decoder vs json.Unmarshal

xuanbao · 897 views
I'm looking to convert a JSON object to a map type, but I'm not sure which method offers the best performance. So far I've got it working with `json.Unmarshal`, but I'm retrieving the JSON object from a PostgreSQL database as a string (I might store the objects as `bytea` in the db, since `json.Unmarshal` requires a `[]byte`), and I've been told that `json.Decoder` would be better. I'm pretty confused about which one I should use and why, or whether to choose a faster package than the standard `encoding/json`.

EDIT: https://github.com/a8m/djson works like a charm!

---

**Comments:**

**divoxx:**

Unmarshal operates on a `[]byte`, meaning it needs the JSON to be fully loaded in memory. If you already have the full JSON stored in a variable, this will likely be a bit faster.

Decoder operates over a stream: any object that implements the `io.Reader` interface. That means you can parse the JSON as it is being received or transmitted, without having to store it fully in memory first. This is useful when dealing with a large dataset, because it doesn't require an extra copy of the entire JSON content in memory.

Honestly, both are fast enough for most cases, and I'd recommend sticking to the standard library unless you have a good reason to use another JSON library.

**weberc2:**

This is wrong, last I checked. Decoder is for streams of JSON objects, but each object is fully loaded into memory (I think it actually calls Unmarshal() under the covers). This is a common misconception, in my experience.

**divoxx:**

Actually, you're right. It buffers the content for a single object internally, but discards it once it reaches the end of a value that can be decoded into the provided struct.
It does not load the entire string containing the many objects into memory, but it does hold the entire JSON for a single object in a buffer.

**JackOhBlades:**

So the question is: if you have a stream of very large objects, how do you consume them piecemeal without loading a giant (say, 1 GB) object into memory?

**SilentWeaponQuietWar:**

I haven't been keeping up to date on the various JSON packages out there, but the last time I looked into it, this jsonparser package was much faster than the standard packages:

https://github.com/buger/jsonparser

I'm pretty sure I've seen some more come out in the past few months that claim to be even more performant, too.

edit: benchmarks: https://github.com/buger/jsonparser#benchmarks

**danredux:**

> https://github.com/buger/jsonparser

https://github.com/buger/jsonparser/issues

It seems like many payloads get read incorrectly, so if correctness is more important than speed, it may not be worth it.

**itsmontoya:**

Also, the benchmarks are tailor-made for that library.
I wrote a library which outperforms it.

**shark1337:**

Looking at the godoc of this package, it doesn't seem to provide conversion, only parsing.

**_a8m_:**

You should check out djson: https://github.com/a8m/djson. I created this project a couple of months ago because I wanted a decoder that works with `interface{}`/`map[string]interface{}`, but also has good performance. You can read the motivation section in the readme to learn more about my use case.

I also opened an issue in the `buger/jsonparser` repo to add `djson` as a compared library (https://github.com/buger/jsonparser/issues/75), and the author asked for a PR. I created a PR, but it seems he's too busy to handle it: https://github.com/buger/jsonparser/pull/78

Hope you enjoy it. Feel free to share your thoughts.

**shark1337:**

It's truly amazing! Works like a charm, and it has a nice API as well. Gonna pin it in the post, thanks for sharing it with me!

**caseynashvegas:**

Out of curiosity, what made you create a whole new open-source package instead of becoming a Go contributor and improving the standard library? I'm just always curious when I see packages pop up that have the same basic purpose as something in the core library.

**hobbified:**

If it's possible to prove a concept outside of core, why *wouldn't* you do that before putting it in core and subjecting it to all of the hassle and inflexibility you have to deal with there?

**caseynashvegas:**

I see your point, but why should it be viewed as a hassle? Also, to prove the concept you could start a fork, right?
**rigadoo:**

As a consumer of a library, it's much easier for me to just import the library than to switch to a forked version of Go. In fact, importing a library off of GitHub is really no harder than importing something from the stdlib.

**bkeroack:**

I typically downvote "here, use my library" answers to beginner questions.

The correct answer is: if you have an `io.Reader`, use Decoder. Otherwise, use Unmarshal.
