<p>I have a Go program that is meant for opening a file and decoding RLE images, and then sending the arrays of pixel data back to Node. Node and the Go app are on the same machine. </p>
<p>First of all, if anyone has any recommendations on how to best do this, please let me know. I looked into Node's IPC option when creating the child process, but it doesn't appear Go supports that unless I'm mistaken. </p>
<p>Right now I am attempting to use WebSockets, with Node's websocket module on one side and gorilla/websocket in my Go program on the other. </p>
<p>However, what I'm seeing is that not all of the messages are received by the Node server. Sometimes no messages come through at all, sometimes one or two but not the rest. I'm not well versed in network/socket stuff, and after googling, all I've found are semi-advanced examples that assume I'm serving up web pages or building a chat client, so I may be missing something fundamental. </p>
<p>All I'm doing is reading bytes, decoding those bytes, and sending an array of ints back to Node. The decoding happens in a goroutine, so I can keep looping through the file, pulling out each image, and decoding them concurrently. </p>
<p>Here is a very simplified version of my Go program right now:</p>
<pre><code>var conn *websocket.Conn
var wg sync.WaitGroup

func main() {
    conn, _, _ = websocket.DefaultDialer.Dial("ws://localhost:1234", nil)

    f, _ := os.Open(filePath)
    defer closeFile(f)
    r := bufio.NewReader(f)

    for /* not end of file */ {
        // get size of RLE block to decode
        var size int16
        binary.Read(r, binary.BigEndian, &size)

        // read 'size' bytes, send to goroutine for decode
        data := make([]byte, size)
        _, _ = io.ReadFull(r, data)

        wg.Add(1)
        go decodeRLE(data)
    }
    wg.Wait()
}

func decodeRLE(data []byte) {
    defer wg.Done()

    // decode the bytes into a slice of uint8 called img (elided)
    imgJSON, _ := json.Marshal(img)
    conn.WriteMessage(websocket.TextMessage, imgJSON)
}
</code></pre>
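<p>A note on the snippet above: gorilla/websocket connections support only one concurrent reader and one concurrent writer, so calling conn.WriteMessage from several decode goroutines at once is not safe. A minimal sketch of serializing the writes with a mutex (the sendImage helper is illustrative, not part of the original program):</p>
<pre><code>// Assumes the same imports as the snippet above, plus "sync".
var writeMu sync.Mutex

// sendImage marshals one decoded image and writes it as a single text message.
// img is []int rather than []uint8 because json.Marshal encodes a []uint8
// ([]byte) as a base64 string instead of an array of numbers.
func sendImage(conn *websocket.Conn, img []int) error {
    imgJSON, err := json.Marshal(img)
    if err != nil {
        return err
    }
    writeMu.Lock() // gorilla/websocket allows only one concurrent writer per connection
    defer writeMu.Unlock()
    return conn.WriteMessage(websocket.TextMessage, imgJSON)
}
</code></pre>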
<hr/>**Comments:**<br/><br/>Snabel3: <pre><p>Why not just print the result from the Go application, execute the program from Node, and use its output? Way simpler, fast, and bulletproof.</p>
<p><a href="https://nodejs.org/api/child_process.html#child_process_child_process_execfile_file_args_options_callback" rel="nofollow">https://nodejs.org/api/child_process.html#child_process_child_process_execfile_file_args_options_callback</a></p></pre>solarnoise: <pre><p>That's a good idea. I previously used Child_Process.spawn but found stdout to be too slow, or the output was getting fragmented.</p>
<p>I just tried execFile and am not sure how to get this to work for my needs. First of all, it seems all of stdout gets sent as one big dump instead of the callback triggering for each individual fmt.Printf, so I would need to make sure all the arrays are getting sent in a way that I can easily parse.</p>
<p>The bigger problem though is that the size of stuff getting sent is HUGE. I'm building arrays that are hundreds of thousands of bytes. I cranked the "maxBuffer" number way up on the execFile options, but still find that I'm not getting all the data sent over. Not sure if it's the buffer size or some other hitch in the process.</p>
<p>Right now I am doing this in the execFile handler:</p>
<pre><code>var o = stdout;
console.log(o.length);
</code></pre>
<p>And I get a different value every time I re-run the execFile. I think this may be because I'm decoding the images concurrently, so the order in which they finish can change, and at some point the maxBuffer limit gets hit.</p>
<p>I suppose I can keep increasing the buffer #, but then I'm just guessing that it's a large enough number that my users will never run into a bug where they're loading a file that's even bigger than I was prepared for. Hope this makes sense.</p>
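<p>One way around the maxBuffer ceiling (a sketch, not the tool's actual code) is to have the Go side write one JSON line per image to stdout; the Node side can then consume the child's stdout line by line, for example with child_process.spawn plus a readline interface, instead of waiting for execFile's single buffered callback:</p>
<pre><code>package main

import (
    "bufio"
    "encoding/json"
    "log"
    "os"
)

func main() {
    // results stands in for a channel fed by the decode goroutines.
    results := make(chan []int, 1)
    results <- []int{0, 0, 0, 0, 255}
    close(results)

    out := bufio.NewWriter(os.Stdout)
    enc := json.NewEncoder(out) // Encode writes one JSON value followed by '\n'
    for img := range results {
        if err := enc.Encode(img); err != nil { // one complete line per image
            log.Fatal(err)
        }
        if err := out.Flush(); err != nil { // flush so the consumer sees each line promptly
            log.Fatal(err)
        }
    }
}
</code></pre>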
<p>If it sounds like I'm missing something (I haven't tried using the Buffer output yet), please let me know.</p></pre>epiris: <pre><p>Okay, so I have to ask a couple of questions first, so that a more thoughtful response isn't given in vain.</p>
<p>First, a long shot here, but are you using lossy RLE? You say you are using RLE images, and on top of that you're getting some but not all of the data, which leads me to think you might be using lossy-encoded files. Worth a quick sanity check.</p>
<p>Second, your general application flow makes no sense to me, the WebSockets and everything. Since you are just sending JSON data, make a very thin Go service that listens over HTTP and returns JSON. Then write tests against your implementation without the Node.js and WebSocket transport obfuscating where your problem may be. Don't test system integration points at a system level before you have vetted them at the component level.</p>
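<p>A minimal sketch of the thin HTTP service described above, assuming a hypothetical decodeFile helper and a placeholder port and query parameter:</p>
<pre><code>package main

import (
    "encoding/json"
    "log"
    "net/http"
)

// decodeFile is a hypothetical stand-in for the real file parsing and RLE decode.
func decodeFile(path string) ([][]int, error) {
    return [][]int{{0, 0, 0, 0}}, nil
}

func main() {
    http.HandleFunc("/decode", func(w http.ResponseWriter, r *http.Request) {
        images, err := decodeFile(r.URL.Query().Get("path"))
        if err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        w.Header().Set("Content-Type", "application/json")
        if err := json.NewEncoder(w).Encode(images); err != nil {
            log.Println(err)
        }
    })
    log.Fatal(http.ListenAndServe("localhost:8080", nil))
}
</code></pre>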
<p>That all said, there are a few places in the code you posted that can fail in all sorts of ways. Maybe you snipped stuff out for brevity, but if you want help finding a bug, code that isn't the code you're actually having trouble with provides no value.</p>
<p>So, you are not checking errors. That's wrong. But you also have a fatal flaw in your application's logic: inside a loop which must decode the file serially, you are launching goroutines to do the work in parallel, so more demanding decode chunks are going to overlap with shorter ones.</p>
<p>TL;DR: a race condition from concurrency that provides no benefit.</p></pre>epiris: <pre><p>To be more clear, go decodeRLE(data) should just be decodeRLE(data). If you want to try to leverage multiple cores, do it on a per-file basis if possible. If you still want to decode across multiple goroutines, you need to ensure serial responses through other means, i.e. don't send just a []byte to decodeRLE; include a channel. Each call to decodeRLE creates a channel for returning the decode result. You can use any FIFO structure for this. Then send the results from the main thread... to ensure serial responses.</p></pre>solarnoise: <pre><p>Hey, thanks for the reply.</p>
<p>I would agree, and I'd definitely argue that most of this tool doesn't make sense: it's going to be a Photoshop plugin, and that entire ecosystem is just crazy. Most of the plugin runs in CEF (embedded Chromium) as an HTML/JavaScript app, but it also has access to Node.js for utility.</p>
<p>What I found was that parsing these large files and decoding the images was far too slow and locked up the whole tool, even when using Web Workers. It just wasn't fast enough. So I had it in my head to offload the file reading/decoding part, and so far Go has mostly worked for that.</p>
<p>Regarding the encoding of the images: it's a dead-simple encoding where any repeating pixels are stored as runs, e.g. a sequence of 0000 gets encoded as 4 0 (four zeroes).</p>
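<p>For illustration, a decoder for the (count, value) scheme described above might look like the following; the exact byte layout is an assumption, so adjust if counts or values are wider than one byte:</p>
<pre><code>// decodeRLE expands (count, value) byte pairs into raw pixels,
// e.g. the pair 4, 0 expands to 0, 0, 0, 0.
func decodeRLE(data []byte) []uint8 {
    img := make([]uint8, 0, len(data)*2)
    for i := 0; i+1 < len(data); i += 2 {
        count, value := int(data[i]), data[i+1]
        for j := 0; j < count; j++ {
            img = append(img, value)
        }
    }
    return img
}
</code></pre>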
<p>Regarding error checking: I do check every error in the actual program; I left it out just to keep the sample pseudo code short and sweet.</p>
<p>The whole Go app runs perfectly, concurrency and all, when run on its own. It's only when I attempt to pass these decoded arrays back that I run into trouble.</p>
<p>I tried using Node's Child_Process.spawn and attaching a stdout handler, but it seems far too slow.</p>
<p>I then tried the websocket route, but don't always get all of the messages sent as I described in the OP. This very well could be a race condition thing, or the connection closing before all the messages could be sent... no idea, hence why I asked here in case there's an obvious step I'm missing =)</p>
<p>But I can assure you the sample code I put up really is the bulk of the program, I just stripped out the RLE decode part as it's really just a loop of getting bytes and adding them to the array.</p></pre>
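<p>For reference, a minimal sketch of the ordering approach epiris describes, under the assumption that images must reach Node in file order: each chunk gets its own buffered result channel, the channels are queued FIFO, and only the main goroutine writes to the connection (the decodeRLE body and the input chunks are placeholders):</p>
<pre><code>package main

import (
    "encoding/json"
    "log"

    "github.com/gorilla/websocket"
)

// decodeRLE is a placeholder; it returns []int so json.Marshal produces a
// numeric array rather than the base64 string a []uint8 would become.
func decodeRLE(data []byte) []int {
    out := make([]int, len(data))
    for i, b := range data {
        out[i] = int(b)
    }
    return out
}

func main() {
    conn, _, err := websocket.DefaultDialer.Dial("ws://localhost:1234", nil)
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close()

    // FIFO of per-chunk result channels: decoding runs in parallel, but the
    // main goroutine drains the queue in the order the chunks were read, so
    // there is a single, ordered writer on the connection.
    pending := make(chan chan []int, 16)

    go func() {
        for _, chunk := range [][]byte{{1, 2}, {3, 4}} { // stand-in for the file read loop
            result := make(chan []int, 1) // buffered so the decoder never blocks
            pending <- result
            go func(data []byte, out chan<- []int) {
                out <- decodeRLE(data)
            }(chunk, result)
        }
        close(pending)
    }()

    for result := range pending {
        imgJSON, err := json.Marshal(<-result)
        if err != nil {
            log.Fatal(err)
        }
        if err := conn.WriteMessage(websocket.TextMessage, imgJSON); err != nil {
            log.Fatal(err)
        }
    }
}
</code></pre>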