Can gorm (or any other Go pseudo-ORM) use SQLite for tests and another database for production?

agolangf · 600 views
<p>Other languages such as Java or Python commonly run unit tests against an in-memory or flat-file database (e.g. H2 or SQLite), and then switch to an external database for production (e.g. PostgreSQL or MySQL). Most ORMs make this a simple matter of configuration, with perhaps some extra hoops to jump through if you need to use raw SQL.</p> <p>I&#39;m not sure if this is possible with any of the Go ORMs (or rather pseudo-ORMs, as the language is so fundamentally different). I&#39;m tinkering with &#34;gorm&#34; right now, and it seems to require you to import your database-specific dialect right there in your source when importing the gorm library itself. I&#39;m not sure how you would go about applying a SQLite dialect in your unit test files, and a MySQL dialect in the main code.</p> <p>Are there any examples out there, with &#34;gorm&#34; or any other pseudo-ORM, of unit testing against SQLite when that&#39;s not your production database?</p> <p>For what it&#39;s worth, I have no interest in using an ORM to autogenerate tables for me... and I&#39;m not using any database-specific functionality like <code>auto_increment</code>. However, column types would differ between the databases (e.g. <code>BINARY(16)</code> for MySQL, and <code>BLOB</code> for SQLite, etc.).</p> <hr/>**Comments:**<br/><br/>robvdl: <pre><p>Almost always a terrible idea. Yes, you can... but I have seen this question asked <em>all the time</em> in Django projects and it never ends well. Postgres and SQLite are just too different; always test against the DB you will be using.</p></pre>eikenberry: <pre><p>For unit testing there shouldn&#39;t be an external database involved at all; it should always be mocked/faked/stubbed. Testing against the database should be saved for integration or system level testing.</p></pre>dlsniper: <pre><p>Not really, there&#39;s no rule for that.
Also, the point of testing against a DB directly is that you don&#39;t have to waste all that time mocking something that you&#39;ll need to test against anyway and, more importantly, it will allow you to catch bugs that you otherwise might not catch (especially if you don&#39;t do integration tests).</p> <p>Imho, just test against the DB you&#39;ll be using and that&#39;s it; it will save you a lot more time and headaches down the road.</p></pre>Twirrim: <pre><p>It&#39;s not that difficult to set up a local database to test against.</p></pre>gnhuy91: <pre><p>Agreed. Mocking misses a lot of potential issues like schema/query mismatch, invalid SQL and more. Why wait until integration tests when you can just run local tests against a real DB before commit?</p></pre>tmornini: <pre><p>Because cycle time is everything.</p> <p>Your argument makes a ton of sense when the project is small, but as it gets bigger, splitting unit and integration tests is the difference between being good and being great.</p> <p>Another advantage to mocking: the ability to test error paths, and the requirement to write code simple enough to mock easily...</p></pre>s-expression: <pre><blockquote> <p>Mocking misses a lot of potential issues like schema/query mismatch, invalid SQL and more</p> </blockquote> <p>That&#39;s what integration testing is for.</p> <p>Mocking is so you can test business logic. That includes injecting <em>bad</em> data and making sure you can handle it. This is pretty tedious when testing with a real database.</p> <p>To bring this back around to OP&#39;s question, mocking against a <em>different</em> database like SQLite is almost always pointless.</p></pre>robvdl: <pre><p>Yes, I realise this. I was mostly comparing it to Django testing, which tends to use a lot of functional testing rather than unit testing... mostly because that is what the docs point people to when first starting out.</p></pre>redditter15: <pre><p>Test against the DBMS you use in production.
Don&#39;t use SQLite if your production DBMS is PostgreSQL.</p> <p>Create a test database to run your tests. You can load <a href="https://github.com/go-testfixtures/testfixtures" rel="nofollow">test fixtures</a> with sample data to have some data to test against.</p></pre>usernameliteral: <pre><p>I would suggest doing integration testing instead. Put your MySQL server in a Docker container (or something else) and run tests that use the db against that.</p></pre>Jwsonic: <pre><p>I&#39;ve been doing exactly what you described. The technique I use is to create a function (or even better, an interface) that takes a pointer to a gorm.DB, such as:</p> <pre><code>func getUser(db *gorm.DB) User </code></pre> <p>In testing you create a gorm.DB using SQLite and pass it to the function. In prod you create a Postgres/MySQL gorm.DB and pass it instead.</p> <p>I recommend reading <a href="http://nathanleclaire.com/blog/2015/10/10/interfaces-and-composition-for-effective-unit-testing-in-golang/" rel="nofollow">this article</a>. It has some good ideas to help you wrap your head around writing unit tests in Go.</p></pre>BadMoonRosin: <pre><p>Thanks! That&#39;s a promising direction to explore. Do you have any public projects applying this pattern up on GitHub or elsewhere?</p> <p>I&#39;m somewhat surprised by all of the comments suggesting integration testing &#34;instead of&#34; unit testing. It&#39;s not standard practice to do <em>both</em>? Unit testing and integration testing really do serve different purposes.</p></pre>Manbeardo: <pre><p>The suggestion is that your unit tests should sit one layer above gorm. The gorm db drivers don&#39;t have identical behavior, so a passing test doesn&#39;t mean it will work in production, and a failing test doesn&#39;t mean it will break.</p></pre>nosmileface: <pre><p>As a hack you can also put the MySQL DB files onto tmpfs (an in-memory filesystem, in many Linux distros mounted at /tmp and /dev/shm).
And there you go: an in-memory MySQL database.</p></pre>Twirrim: <pre><p>SQLite speaks a very distinct dialect of SQL, and lacks, for example, strict type checking. <a href="https://www.sqlite.org/faq.html#q3" rel="nofollow">https://www.sqlite.org/faq.html#q3</a></p> <p>It might suffice to some extent for developing, but you absolutely must test your code against the same database software as you run in production.</p> <p>Even then, it&#39;s worth pointing out that every DB implements and expands on SQL in its own peculiar ways, and every DB engine has very different performance characteristics. Code that runs fast against SQLite might be slow in Postgres or MySQL.</p></pre>BadMoonRosin: <pre><p>Again... I&#39;m just blown away by the assumption in this thread that one does unit testing OR integration testing, rather than both.</p> <p>OF COURSE you run integration tests against the same database type as production before deploying. However, you&#39;ll <em>also</em> typically have a suite of more low-level unit tests... which you&#39;ll run 1,000 times a day on your local laptop, and which will run on a CI server that ideally doesn&#39;t have a database instance running on it.</p> <p>You certainly can have your unit tests abstract away the database. Just write a separate data-access layer for testing, which stores and retrieves entities from in-memory <code>map</code>s or something. However, if you have access to a true in-memory SQL database, then it&#39;s one less piece of test scaffolding that you have to write. Even if its SQL dialect and behavior are different (e.g. SQLite), then at least there&#39;s hopefully <em>less</em> you&#39;ll have to write in your scaffolding layer.</p> <p>I&#39;m a bit spoiled by Java, which has H2... an embedded database that&#39;s capable of emulating the behavior of MySQL, PostgreSQL, Oracle, etc. So unit testing your data layer is nearly trivial.
I&#39;m gathering that Go doesn&#39;t currently have anything like this, and I am starting to agree that SQLite doesn&#39;t really fit this niche. Its SQL dialect and dynamic typing are weird, and it requires a C compiler to build any of the Go drivers.</p> <p>So apparently, the options at this point in time are:</p> <ol> <li><p>Abstract away the database layer, by writing some test scaffolding that stores and retrieves records from in-memory <code>map</code>s, or</p></li> <li><p>Don&#39;t unit test. Just run integration tests against the target DB.</p></li> <li><p>Go nuts with Docker... which is still kinda integration testing, since now you need Docker available on the CI server.</p></li> </ol> <p>I&#39;ll probably go with #1. It sounds like most of the people in this thread are in the #2 camp. You certainly can make a case for that approach, but it isn&#39;t unit testing.</p></pre>Twirrim: <pre><p>I&#39;m blown away by you somehow inferring a one-or-the-other approach from what I said... My point was just to be careful. If you develop against SQLite locally, you&#39;re getting different behaviour and performance from your deployment target.</p> <p>I&#39;m not sure quite why you&#39;re jumping to Docker there. It&#39;s ridiculously simple to run a local instance of Postgres or MySQL to develop against; no need to involve Docker in the mix.</p></pre>tmornini: <pre><p>4. Don&#39;t test state in unit tests, test behavior.</p> <p>If you call gorm with the correct parameters, you&#39;re good -- so long as your integration tests work as well.</p> <p>Testing that the DB stores and retrieves items properly is not the best way to write unit tests...</p></pre>znpy: <pre><p>I&#39;ll share my experience.</p> <p>I am developing a (small?) web application, and I use PostgreSQL in &#34;production&#34;.
In order to run my tests faster, I tried running tests against an in-memory SQLite db.</p> <p>The thing is: while tests might run faster, the whole compilation time gets a lot slower.</p> <p>This is because while lib/pq is pure Go and compiles super fast, the SQLite library compiles C stuff underneath (and you&#39;ll end up recompiling the same stuff every time).</p> <p>Also, you&#39;ll build a dependency you don&#39;t truly need, and you might not catch your-db-specific bugs.</p> <p>Since I run my own Jenkins on one of my boxes, my solution was to spawn a PostgreSQL docker container (with specific parameters), run my tests, and finally delete the docker container.</p> <p>I must say, compilation is now fast again and it works extremely well.</p> <p>Here are some notes: <a href="https://znpy.wordpress.com/2016/04/25/testing-go-applications-with-postgresql-and-docker/" rel="nofollow">Testing Go Applications with PostgreSQL and Docker</a></p></pre>
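The "abstract away the database layer" option (#1 above) can be sketched in plain Go without any ORM: business logic depends on an interface, unit tests pass an in-memory map-backed fake, and production wires in a real SQL-backed implementation. The names `UserStore`, `MemStore`, and `Greet` here are hypothetical, invented for illustration; they are not part of gorm or any library.

```go
package main

import (
	"errors"
	"fmt"
)

// User is a minimal domain type for the example.
type User struct {
	ID   int
	Name string
}

// UserStore abstracts the data-access layer, so unit tests never
// touch a real database. A production implementation would wrap
// *sql.DB or *gorm.DB; the fake below wraps a map.
type UserStore interface {
	GetUser(id int) (User, error)
	SaveUser(u User) error
}

// MemStore is the in-memory test double: it stores records in a map.
type MemStore struct {
	users map[int]User
}

func NewMemStore() *MemStore {
	return &MemStore{users: make(map[int]User)}
}

func (m *MemStore) GetUser(id int) (User, error) {
	u, ok := m.users[id]
	if !ok {
		return User{}, errors.New("user not found")
	}
	return u, nil
}

func (m *MemStore) SaveUser(u User) error {
	m.users[u.ID] = u
	return nil
}

// Greet is business logic that depends only on the interface,
// so it can be unit tested with MemStore and no database at all.
func Greet(s UserStore, id int) (string, error) {
	u, err := s.GetUser(id)
	if err != nil {
		return "", err
	}
	return "Hello, " + u.Name, nil
}

func main() {
	store := NewMemStore()
	store.SaveUser(User{ID: 1, Name: "Ada"})
	msg, _ := Greet(store, 1)
	fmt.Println(msg) // Hello, Ada
}
```

The fake also makes it easy to test error paths (missing rows, injected failures), which is exactly the point tmornini and s-expression make above about mocking for business logic.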
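On the OP's note about column types differing per database (<code>BINARY(16)</code> for MySQL vs <code>BLOB</code> for SQLite): if you did want one schema definition shared across databases, a small dialect-to-type lookup table is one way to render per-database DDL. This is a hypothetical sketch, not gorm's dialect mechanism; the function and map names are invented for illustration.

```go
package main

import (
	"fmt"
	"strings"
)

// dialectTypes maps a logical column type to its concrete SQL type
// for each dialect. The entries mirror the examples from the post:
// a 16-byte UUID column is BINARY(16) on MySQL but BLOB on SQLite.
var dialectTypes = map[string]map[string]string{
	"mysql":  {"uuid": "BINARY(16)", "text": "VARCHAR(255)"},
	"sqlite": {"uuid": "BLOB", "text": "TEXT"},
}

// createTableSQL renders a CREATE TABLE statement for the given
// dialect from a list of (column name, logical type) pairs.
func createTableSQL(dialect, table string, cols [][2]string) string {
	var defs []string
	for _, c := range cols {
		defs = append(defs, c[0]+" "+dialectTypes[dialect][c[1]])
	}
	return fmt.Sprintf("CREATE TABLE %s (%s)", table, strings.Join(defs, ", "))
}

func main() {
	cols := [][2]string{{"id", "uuid"}, {"name", "text"}}
	fmt.Println(createTableSQL("mysql", "users", cols))
	fmt.Println(createTableSQL("sqlite", "users", cols))
}
```

As the thread repeatedly warns, this only papers over type names; SQLite's dynamic typing and dialect differences mean integration tests against the production DBMS are still needed.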
