<p>I want to build a test helper that takes a SQL query and some expected rows and asserts that they're equal: <code>func test(query string, wanted [][]interface{}) error</code>. However, the SQL interface expects pointers to values, so I'm pretty sure that for each row in <code>wanted</code>, I need to create an <code>[]interface{}</code> whose values have the same types as the values in <code>wanted</code> (these are the values I need to compare against <code>wantedRow</code>), and from there I need to create an <code>[]interface{}</code> where each element is a pointer to the corresponding element of the value slice. Something like:</p>
<pre><code>values := make([]interface{}, len(wantedRow))
pointers := make([]interface{}, len(wantedRow))
for i, v := range wantedRow {
	values[i] = reflect.Zero(reflect.TypeOf(v)).Interface()
	pointers[i] = &values[i]
}
if err := rows.Scan(pointers...); err != nil {
	return err
}
</code></pre>
<p>The problem here is that each element of <code>pointers</code> is a <code>*interface{}</code>, so the sql package (or the driver?) just decodes into whatever type it finds convenient (<code>[]uint8</code> for strings, <code>int64</code> for ints, etc.) instead of the types stored in <code>values</code>. I've also tried <code>pointers[i] = reflect.ValueOf(values[i]).Addr().Interface()</code> and various similar permutations, but these tend to fail either because <code>values[i]</code> fails <code>CanAddr()</code> (and thus panics), or because I end up creating pointers in <code>pointers</code> that are typed correctly but don't point into <code>values</code>.</p>
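<p>(As a minimal, self-contained illustration of both failure modes described above, hypothetical string and int columns assumed; nothing here touches a database:)</p>
<pre><code>package main

import (
	"fmt"
	"reflect"
)

func main() {
	values := make([]interface{}, 2)
	values[0] = "" // dynamic type string
	values[1] = 0  // dynamic type int

	// The static type of &values[i] is *interface{} regardless of the
	// dynamic type, which is why Scan falls back to driver-default types.
	fmt.Println(reflect.TypeOf(&values[0])) // *interface {}
	fmt.Println(reflect.TypeOf(&values[1])) // *interface {}

	// A value unpacked from an interface{} is a copy, so it is not
	// addressable -- calling Addr() on it would panic.
	fmt.Println(reflect.ValueOf(values[0]).CanAddr()) // false
}
</code></pre>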
<p>Any help here is appreciated!</p>
<hr/><strong>Comments:</strong><br/><br/>hell_0n_wheel: <pre><p>If you're using <code>reflect</code> you're probably doing something wrong. Time to stop, step back, and think about what it is you're trying to do.</p>
<blockquote>
<p>want to build a test helper that takes a SQL query and some expected rows and asserts that they're equal</p>
</blockquote>
<p>Why? What exactly are you trying to test? Why do you need to compare the results of Scan() directly? Doesn't your application have a model of the data in the DB (as necessitated by the Separation of Concerns principle)?</p>
<p>This would make comparison easy: just populate your models and compare the models. Done.</p></pre>weberc2: <pre><blockquote>
<p>If you're using reflect you're probably doing something wrong. Time to stop, step back, and think about what it is you're trying to do.</p>
</blockquote>
<p>Fair enough; I'm open to suggestions.</p>
<blockquote>
<p>Why? What exactly are you trying to test? Why do you need to compare the results of Scan() directly? Doesn't your application have a model of the data in the DB (as necessitated by the Separation of Concerns principle)?</p>
</blockquote>
<p>I'm building a compiler for SQL. As such, I have no data models. I want to assert that my high-level query objects compile to SQL that does what I think it does. I want a helper function because I don't want to deal with the boilerplate involved with mucking around with <code>*sql.Rows</code>. I can't create a type that represents any row returned by any query I might want to test, so <code>reflect</code> seems like the next best thing.</p></pre>hell_0n_wheel: <pre><blockquote>
<p>I want to assert that my high-level query objects compile to SQL that does what I think it does</p>
</blockquote>
<p>You're using a database engine to test a SQL compiler? That's quite, erm, awkward. I'll admit, compiler design isn't my strong suit, but this test has some smell to it. Can you do this without invoking a database? Seems like that'd be a better way to go.</p>
<blockquote>
<p>I'm clearly not violating SoC here.</p>
</blockquote>
<p>Interfacing with a DB, marshalling the data returned from the DB, and inferring equality, I see three different concerns right there.</p></pre>weberc2: <pre><blockquote>
<p>You're using a database engine to test a SQL compiler? That's quite, erm, awkward. I'll admit, compiler design isn't my strong suit, but this test has some smell to it. Can you do this without invoking a database? Seems like that'd be a better way to go.</p>
</blockquote>
<p>Not reliably. I can test that it produces the SQL that I expect it to produce, but that's fragile and I care more that the SQL output <em>does</em> what I expect it to do.</p>
<blockquote>
<p>Interfacing with a DB, marshalling the data returned from the DB, and inferring equality, I see three different concerns right there.</p>
</blockquote>
<p>To be clear, this is a test helper function; my compiler doesn't depend on a SQL engine. It will spit out SQL strings all the livelong day; whether or not you pass them to a database is your business. My <em>tests</em> do pass them to a database, because that's the cheapest and most reliable way to validate my compiler. You can bet your ass that at some point, the Go compiler tests actually <em>run</em> the generated assembly as part of the test cycle--it doesn't just assert that the output assembly looks a certain way. :)</p></pre>hell_0n_wheel: <pre><blockquote>
<p>Not reliably</p>
</blockquote>
<p>In my view, depending on a 3rd party to do your validation is less reliable than testing inside your own source tree.</p>
<blockquote>
<p>I can test that it produces the SQL that I expect it to produce</p>
</blockquote>
<p>Shouldn't that be enough for a compiler? How's it fragile?</p></pre>weberc2: <pre><p>There are a couple of problems with not testing against a database. First is the fragility problem--my compiler should be free to optimize the SQL however it likes so long as the output is the same, and I don't want to go and rewrite a bunch of tests every time I add an optimization. Secondly, my users don't care that my compiler produces the output I think it should produce, they care that the compiler produces correct SQL. On the other hand, the downside of testing against a database is a relatively tiny bit of complexity in your test process; it would be foolish <em>not</em> to make this tradeoff.</p></pre>hell_0n_wheel: <pre><blockquote>
<p>my compiler should be free to optimize the SQL however it likes so long as the output is the same</p>
</blockquote>
<p>Ah, if your output is dynamic, then your unit test should have some simplified version of the compilation logic in it... that's what's specified in the golang book! No use trying to hit a moving target.</p>
<blockquote>
<p>my users don't care </p>
</blockquote>
<p>Uhh, you writing a requirements test, an integration test, or a unit test here?</p>
<p>Seems you're trying to do all of the above...</p></pre>weberc2: <pre><blockquote>
<p>Ah, if your output is dynamic, then your unit test should have some simplified version of the compilation logic in it... that's what's specified in the golang book! No use trying to hit a moving target.</p>
</blockquote>
<p>I'm not sure what you're proposing. Why would a unit test contain the logic it is validating? This all seems terribly complicated compared to simply passing the sql into the database.</p>
<blockquote>
<p>Uhh, you writing a requirements test, an integration test, or a unit test here?</p>
</blockquote>
<p>Depends on your definitions. I'm writing a lot of exhaustive tests with a database in the loop to assert that my compiler behaves as required.</p></pre>hell_0n_wheel: <pre><blockquote>
<p>Why would a unit test contain the logic it is validating?</p>
</blockquote>
<p>Specifically, a vastly simplified version of the logic. Why do this? To provide a verification of the unit's output, and ensure the unit's output doesn't drift over time. And as a bonus, it's not a fragile test.</p>
<blockquote>
<p>This all seems terribly complicated compared to simply passing the sql into the database.</p>
</blockquote>
<p>Not when you consider your simplified logic is several orders of magnitude less code than "the database". Also "the database's" code isn't exactly well controlled, in comparison to the logic in your test.</p>
<p>Another factor is the portability / ease of use of your test code. That's all much improved by using simple, local code for your tests, and not pulling third party dependencies.</p>
<blockquote>
<p>Depends on your definitions</p>
</blockquote>
<p>Not my definitions. They've been around for quite some time; see the classic Art of Software Testing (Myers) for one reference.</p></pre>weberc2: <pre><blockquote>
<p>Specifically, a vastly simplified version of the logic. Why do this? To provide a verification of the unit's output, and ensure the unit's output doesn't drift over time. And as a bonus, it's not a fragile test.</p>
</blockquote>
<p>I still don't follow. In particular, I don't know what you mean by "a vastly simplified version" or how I would use such a thing to validate the actual implementation, much less in a manner that precludes drift and fragility.</p>
<blockquote>
<p>Not when you consider your simplified logic is several orders of magnitude less code than "the database".</p>
</blockquote>
<p>Right, but I'm not maintaining the database; it's well tested and dependable.</p>
<blockquote>
<p>Also "the database's" code isn't exactly well controlled, in comparison to the logic in your test.</p>
</blockquote>
<p>I don't know what this could possibly mean; the database's code is almost certainly much better than mine. It's certainly better-scrutinized.</p>
<blockquote>
<p>Another factor is the portability / ease of use of your test code. That's all much improved by using simple, local code for your tests, and not pulling third party dependencies.</p>
</blockquote>
<p>This whole appeal is beginning to seem like dogma, all to avoid bringing in a dependency that we're going to need at some point anyway. It took me a few seconds to go get the test dependency and apart from spinning on the test fixture, I'm flying through tests. Beyond that, the tests will run on any platform with only DATABASE_DRIVER and DATABASE_CONNECTION_STRING variables. Plus I get a high degree of confidence knowing that everything works against the actual databases I'm targeting. Maybe what you're proposing is really a lot better, but the vague terms, lofty promises, and appeals to authority are tripping my BS detector.</p>
<blockquote>
<p>Not my definitions. They've been around for quite some time; see the classic Art of Software Testing (Myers) for one reference.</p>
</blockquote>
<p>I didn't mean to imply that you defined them, only that these terms get thrown out by different people meaning different things to the point that they're hardly clear. Frankly, the distinctions don't interest me that much any more; I care more about writing working code than I do about abiding by a particular testing philosophy.</p></pre>jerf: <pre><p>I'm not understanding why you can't use reflect.DeepEqual. I'm not saying you can, just that I don't understand why you can't. Possibly with a quick bit of preprocessing to do some slight type tweaks.</p></pre>weberc2: <pre><p>The problem was building a list of pointers that I can pass into sql.Rows.Scan(); comparing is the easy part. In my sibling comment (downvoted to the bottom, for some reason), I explain the solution I found, which is to build a slice of pointers into the heap, pass them to sql.Rows.Scan(), and then iterate over them, dereferencing each and comparing it to its corresponding expected value. As you mention, I could build a <code>[]interface{}</code> of values and then do the deep equal with the expected row, but that would be more work for no obvious gain.</p></pre>forfuncsake: <pre><p>Another alternative is to use a package like this: <a href="https://github.com/jmoiron/sqlx" rel="nofollow">https://github.com/jmoiron/sqlx</a></p>
<p>You could then use <code>.StructScan()</code> on the returned sqlx.Rows and compare to your static test data, which would be in a struct.</p></pre>weberc2: <pre><blockquote>
<p><a href="https://github.com/jmoiron/sqlx" rel="nofollow">https://github.com/jmoiron/sqlx</a></p>
</blockquote>
<p>I didn't know that existed, but wouldn't it be subject to pretty much the same problem? If I understand correctly, I'd need to use reflect to allocate a struct of the right type and get an address to it to pass into StructScan()?</p></pre>forfuncsake: <pre><p>No. If I read correctly, you indicated that you're testing SQL queries and you know what the return values /should/ be.</p>
<p>In that case, you just manually create a struct that represents the row you expect. Then let sqlx do its thing (which does involve reflection.. but you don't have to write it); and finally compare the scanned struct to the expected one. </p>
<pre><code>type myRow struct {
	ID    int
	Name  string
	Value string
}

expect := myRow{ID: 1, Name: "foo", Value: "bar"}
var got myRow
err := db.QueryRowx(query).StructScan(&got)
if err != nil {
	t.Fatalf("blah blah: %v", err)
}
if got != expect {
	t.Fatal("yada yada")
}
</code></pre>
<p>You may also need to set db tags on the struct fields, or set the Mapper (all documented, if required).</p></pre>weberc2: <pre><p>Ah, I see. I think the overhead of defining an extra struct is not ideal for my use case, but it's nice to know that exists. Currently, this is what my tests look like:</p>
<pre><code>test(query, []string{"expectedCol1", "expectedCol2"}, [][]interface{}{{"foo", 42}, {"bar", 24}})
</code></pre></pre>weberc2: <pre><p>I think I solved it. Instead of creating a slice of values and then a slice of pointers to those values, I'm creating a slice of pointers to values on the heap:</p>
<pre><code>pointers := make([]interface{}, len(wantedRow))
for i, v := range wantedRow {
	pointers[i] = reflect.New(reflect.TypeOf(v)).Interface()
}
</code></pre>
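<p>(A quick sanity check, again assuming hypothetical string and int columns, that these pointers carry concrete types rather than <code>*interface{}</code>:)</p>
<pre><code>package main

import (
	"fmt"
	"reflect"
)

func main() {
	wantedRow := []interface{}{"foo", 42}

	pointers := make([]interface{}, len(wantedRow))
	for i, v := range wantedRow {
		// reflect.New allocates a zero value on the heap and returns a
		// pointer to it; Interface() yields that pointer as an interface{}
		// whose dynamic type is the concrete pointer type (*string, *int).
		pointers[i] = reflect.New(reflect.TypeOf(v)).Interface()
	}

	fmt.Println(reflect.TypeOf(pointers[0])) // *string
	fmt.Println(reflect.TypeOf(pointers[1])) // *int
}
</code></pre>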
<p>And then pass that into <code>rows.Scan()</code>, and then compare those values to the values in <code>wantedRow</code>:</p>
<pre><code>for i, v := range wantedRow {
	actual := reflect.ValueOf(pointers[i]).Elem().Interface()
	if v != actual {
		return fmt.Errorf("wanted %v; got %v on row %d", v, actual, i)
	}
}
</code></pre></pre>BurpsWangy: <pre><p>I'm confused. Isn't the entire purpose of SQL to use a specified query to pull out the data you need from a database? Why are you iterating through scan returns instead of specifying in the initial query what it is you're needing? Let T-SQL (or the like) do the processing for you. Don't pull bulk queries from a SQL database and then process the return in Go. You're doing it wrong...</p></pre>weberc2: <pre><blockquote>
<p>I'm confused. Isn't the entire purpose of SQL to use a specified query to pull out the data you need from a database? Why are you iterating through scan returns instead of specifying in the initial query what it is you're needing? Let T-SQL (or the like) do the processing for you. Don't pull bulk queries from a SQL database and then process the return in Go. You're doing it wrong...</p>
</blockquote>
<p>I'm not sure how you got that out of my posts. I'm writing tests for SQL queries, and the only way to adequately test SQL queries is to run them against the target database and assert that you get the right thing back. As such, I'm not "processing results" as you describe; I'm asserting that the result contains what I expect it to contain.</p></pre>BurpsWangy: <pre><p>Ah, gotcha. Are you using the standard SQL library, or constructing your own? I ask because if the standard one, there is already test files included to ensure the queries are implemented correctly. Just don't want you to put in a lot of time on something that may be redundant.</p></pre>weberc2: <pre><p>I'm using the standard SQL library, but I'm not familiar with those test files. I already have a working implementation, but I appreciate the effort! :)</p></pre>