<p>I am trying to build a system with multiple scheduled API fetch calls. A quick Google search turns up <a href="https://github.com/RichardKnop/machinery" rel="nofollow">https://github.com/RichardKnop/machinery</a>; has anyone used this before?</p>
<hr/>**Comments:**<br/><br/>bkeroack: <pre><p>There tends to be less need for something like this in the Go world (vs Python, Ruby, etc.) because it's really easy to do asynchronous actions in-process with goroutines.</p>
<p>That said, there's <a href="http://nsq.io" rel="nofollow">NSQ</a> which I've used in the past and is nice and lightweight. Kafka is another (less lightweight) option. Neither is as high-level or abstracted as Celery, however.</p></pre>daniels0xff: <pre><blockquote>
<p>There tends to be less need for something like this in the Go world (vs Python, Ruby, etc) because it's really easy to do asynchronous actions in-process with goroutines.</p>
</blockquote>
<p>This is not necessarily true. Sure, you can do simple stuff in a goroutine, but it's nowhere near something like Celery.
How do you track the progress of your task? What if it's a long-running, expensive task? If you don't queue tasks, you end up overloading the server (assuming each browser request triggers a background task). The same goes for a ton of small tasks: without a queue, you end up overloading the server.</p>
<p>For a webapp, if you process something in a goroutine instead of a dedicated worker, you again risk overloading the server and preventing it from being performant at what it should be doing: serving HTTP requests.</p>
<p>What if the app crashes? Or you restart the server? How do you guarantee persistence of your tasks, so that the workers simply resume their work without losing the tasks in the queue?</p>
<p>Celery does all this for you out of the box with just a "pip install celery". It takes about 5 minutes to have Celery up and running in your dev env, with a hello-world task you can play with.</p></pre>fancy_pantser: <pre><p>Machinery is probably what you want: <a href="https://github.com/RichardKnop/machinery" rel="nofollow">https://github.com/RichardKnop/machinery</a></p>
<p>For more: <a href="https://awesome-go.com/#messaging" rel="nofollow">https://awesome-go.com/#messaging</a></p></pre>redditbanditking: <pre><p>I haven't really used Machinery, but a quick look at it makes me wonder whether its complexity is needed at all for distributed tasks.</p>
<p>If you are looking to distribute async jobs within the same process/CPU/server, use channels and goroutines. If you are looking for something more scalable across multiple nodes and servers, then RabbitMQ, Redis, or SQS does the job as the queue backend.</p></pre>daniels0xff: <pre><p>I'd be interested in this too. You could use things like Gearman, RabbitMQ, etc. directly with Go and get something working, but something like Celery would be nice. Celery is really awesome if you work in Python.</p></pre>mcouturier: <pre><p>If there's a good Go client library, why would you care whether the server is written in Go or not? Will you fork and contribute?</p></pre>jney: <pre><p>Uber's Cherami is intended to replace Celery in their own architecture: <a href="https://eng.uber.com/cherami/" rel="nofollow">https://eng.uber.com/cherami/</a></p></pre>