<p>Hey all,</p>
<p>I'm working on a Go project that uses a database, and I was wondering what the community considers the best tool for database migrations.</p>
<p>I would like to easily be able to reproduce the schema so that I can also write (integration) tests that run against the database.</p>
<p>I was trying to keep things simple and use only the standard library but it seems I have no choice when it comes to migrations.</p>
<p>I suppose I could also keep the parts of the schema that I need for each test in my Go files, but I suspect that will get messy very quickly.</p>
<p>Thank you</p>
<hr/>**Comments:**<br/><br/>F21Global: <pre><p>I recently went through this myself. The most popular ones are <a href="https://github.com/mattes/migrate/" rel="nofollow">https://github.com/mattes/migrate/</a> and <a href="https://bitbucket.org/liamstask/goose" rel="nofollow">https://bitbucket.org/liamstask/goose</a></p>
<p>Unfortunately, they do not appear to be maintained. The ones that are currently maintained are:</p>
<ul>
<li><a href="https://github.com/rubenv/sql-migrate" rel="nofollow">https://github.com/rubenv/sql-migrate</a></li>
<li><a href="https://github.com/DavidHuie/gomigrate" rel="nofollow">https://github.com/DavidHuie/gomigrate</a></li>
</ul>
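<p>For reference, tools like sql-migrate keep each migration as a plain SQL file with annotated up/down sections, which is what makes the schema reproducible from scratch. The sketch below follows sql-migrate's documented <code>-- +migrate Up</code> / <code>-- +migrate Down</code> markers; the table and columns are made up for illustration:</p>

```sql
-- +migrate Up
CREATE TABLE users (
    id   SERIAL PRIMARY KEY,
    name TEXT NOT NULL
);

-- +migrate Down
DROP TABLE users;
```

<p>Running the tool applies all pending "Up" sections in order; rolling back runs the matching "Down" sections.</p>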
<p>Unfortunately, those two are heavily tied to <code>database/sql</code> and the gorp ORM, which was unsuitable for me, so I made my own:</p>
<ul>
<li><a href="https://github.com/Boostport/migration" rel="nofollow">https://github.com/Boostport/migration</a></li>
</ul></pre>-Nii-: <pre><p>How do you consider them unmaintained? They both seem reasonably up to date.</p></pre>F21Global: <pre><p>mattes/migrate's last commit was in March. It's got a bunch of PRs that aren't being merged. Although there was a fork that's aiming to take over development, I ended up having to create my own library, as getting it to support embedded migrations would basically require a rewrite.</p>
<p>Goose was last updated in January 2015 and there are also heaps of unmerged PRs. Unfortunately, there doesn't appear to be a fork that's under development.</p></pre>tty5: <pre><p>We maintain a somewhat modified fork of goose: <a href="https://github.com/pressly/goose" rel="nofollow">https://github.com/pressly/goose</a>. It allows you to compile all Go migrations into a binary, so it can be run in a container without a working Go toolchain in it.</p></pre>weberc2: <pre><p>Thanks for posting this! Migrations are important. It sounds like there is a lot of interest in this thread in forking and maintaining a tool. Maybe you could consolidate efforts?</p></pre>kaneshin: <pre><p>I used to use <a href="https://bitbucket.org/liamstask/goose" rel="nofollow">https://bitbucket.org/liamstask/goose</a> to apply SQL files, but goose has some bugs.
So I created <a href="https://github.com/eure/kamimai" rel="nofollow">https://github.com/eure/kamimai</a>, which currently supports only the MySQL driver.</p>
<p>Would you like other drivers?</p></pre>xiegeo: <pre><p>I'm not sure what you need, migrations and testing? Are these requirements related?</p>
<p>Also, are you migrating (changing the database schema) once, like on a server, or multiple times, like with client side databases?</p>
<p>Personally, when adding features to a server and changing the schema as needed, I have just appended to my schema script like the following (SQLite example):</p>
<pre><code>var createLogTable = `
CREATE TABLE IF NOT EXISTS uilog (
...
);
CREATE INDEX IF NOT EXISTS time_idx ON uilog (time);
-- ALTER TABLE ... ADD COLUMN statements are listed newest-first:
-- execution stops at the first column that already exists, so only
-- the missing (newer) columns get added.
ALTER TABLE uilog ADD COLUMN lastTime INTEGER;
ALTER TABLE uilog ADD COLUMN count INTEGER;
ALTER TABLE uilog ADD COLUMN build TEXT;`
</code></pre>
<p>In the above script, I added build in v2, count in v3 and lastTime in v4. The script can create a new database or migrate any older version of the database to v4. This allows me to recover from an old database without worrying about compatibility.</p>
<p>Depending on your database, you might have even better scripting support for database migration. A benefit of doing migration in SQL with a static script is that the Go code never has to worry about older databases.</p></pre>neoasterisk: <pre><blockquote>
<p>I'm not sure what you need, migrations and testing? Are these requirements related?</p>
</blockquote>
<p>Greetings,</p>
<p>I am not entirely sure myself what exactly is needed. I suppose the best way to explain it is that I am trying to figure out "best practices" when working with databases in Go. Writing code that communicates with a database is pretty straightforward, but what about testing and migrating the schema?</p>
<p>Are testing and migrations related? In my mind they are. If I start writing tests without solving the migration problem first, then my tests are gonna have a bunch of SQL in them which will require changes and maintenance as the project grows. With migrations, I can at least make sure that my schema lives in one place for the whole project.</p>
<p>That's pretty much how I see things. Maybe I am wrong.</p>
<blockquote>
<p>Also, are you migrating (changing the database schema) once, like on a server, or multiple times, like with client side databases?</p>
</blockquote>
<p>I was trying to keep things simple and keep some SQL scripts in my Go files, but that doesn't seem to be very maintainable. Ideally, the schema should be reproducible on both the client and the server. This is also required for testing: each test might have to recreate a quick test database and add the schema, and I do not know the overhead of that. It might be necessary to first create the database and schema, then run all the tests, and then clean up.</p>
<blockquote>
<p>Depending on your database, you might have even better scripting support for database migration.</p>
</blockquote>
<p>The database has not been decided yet, but it's probably going to be either Postgres or MySQL (slightly leaning towards Postgres).</p>
<blockquote>
<p>A benefit of doing migration in SQL with a static script is that the Go code never has to worry about older databases.</p>
</blockquote>
<p>I am not quite sure what you mean by that. What if you have in your static script some SQL code that only runs on a specific MySQL version but not on earlier versions? I recently had a special case where I had an SQL query in my Go code which was running just fine on the server and my laptop (both with MySQL 5.5); then when I updated MySQL to 5.7 on my laptop, the query stopped working. Of course the query itself was a little weird and it probably was an exception, but my point is that you can never be sure. (By the way, this is one of the reasons why we are leaning towards the Postgres side vs MySQL.)</p>
<p>Unfortunately, I am sad to realize (based on the responses here and some research I've done) that Go tooling, when it comes to databases, doesn't seem to be mature enough. Two of the responders here have rolled their own libraries because the existing ones either do not cover their use cases or are unmaintained. This worries me a lot.</p>
<p>Furthermore, at the time of writing there are not many responses either. Maybe this is a trivial problem that most people know how to solve and do not care to reply to? Maybe they aren't testing their database code? Maybe they haven't figured out best practices yet? I do not know, but this is making me worry.</p>
<p>The sad reality is that, no matter how much I love Go and how much I want to push it as a good fit for the project (the other candidate is Java), if the database tooling does not help then there's no way to compete. Go's advantages of simplicity, readability (and hopefully maintainability) might not be enough to convince the others. For better or worse, Java and its tooling are much better understood in the enterprise world. I am not saying that I want the equivalent of Hibernate or Spring in Go; in fact I'd prefer to keep things as simple as possible, but some good and mature tooling can help a lot.</p>
<p>After some more research, I am realizing that <a href="https://bitbucket.org/liamstask/goose/" rel="nofollow">goose</a> seems to be the most mature tool for migrations (though <a href="/u/F21Global" rel="nofollow">/u/F21Global</a>'s answer gave me some doubts for sure). I also stumbled upon an article which explains how to do <a href="https://divan.github.io/posts/integration_testing/" rel="nofollow">Integration testing in Go using Docker</a>. This seems like a very good solution to most problems: a Docker container running the same OS version as the server, with all your integration tests run against it. Unfortunately, I do not know Docker well enough, I do not know how fast this will be, and this solution seems to introduce more complexity into the project.</p>
<p>I want to thank everyone for the responses (<a href="/u/kaneshin" rel="nofollow">/u/kaneshin</a>, <a href="/u/xiegeo" rel="nofollow">/u/xiegeo</a>, <a href="/u/HectorJ" rel="nofollow">/u/HectorJ</a>, <a href="/u/F21Global" rel="nofollow">/u/F21Global</a>). I am gonna keep researching and analyzing the situation. Hopefully I'll figure out a good way to do all these things, and I'll keep checking this thread for more responses. In the worst case, the project will be done in Java. Hey, it's not gonna kill me to write Java, but I was looking forward to writing some Go during the day instead.</p></pre>xiegeo: <pre><blockquote>
<p>testing </p>
</blockquote>
<p>Testing in Go is mostly unit tests. Many would argue integration testing should be done separately, especially when you use an external database. Integration testing is much higher level; it is just as complicated no matter which technology you use, so I don't see it impacting the choice of one language over another.</p>
<blockquote>
<p>I was trying to keep things simple and keep some SQL scripts in my Go files, but that doesn't seem to be very maintainable.</p>
</blockquote>
<p>I actually find this way of working reasonable for my own simple use cases; too much tooling is needed to do it another way. I just write my SQL in a GUI-based client, test against my database, then copy it into my .go files with sample values replaced by parameters.</p>
<blockquote>
<p>Ideally, the schema should be reproducible on both the client and the server.</p>
</blockquote>
<p>This is not a common architecture. Is the client directly accessing SQL or maintaining a cache database? Normally, I would hide the database behind an API that uses an object-oriented serialization protocol such as JSON, and cache the API calls instead of maintaining a SQL database on the client.</p>
<blockquote>
<p>What if you have in your static script some SQL code that only runs on a specific MySQL version but not on earlier versions?</p>
</blockquote>
<p>This problem never occurred to me. I haven't thought about outdated dependencies since using Go. Either update, or develop and test your scripts on the oldest version you want to support.</p>
<blockquote>
<p>when I updated MySQL to 5.7 on my laptop, the query stopped working.</p>
</blockquote>
<p>A point release breaking compatibility is not something you can or should be defending against. I can't really help you there.</p>
<p>IMHO, SQL is a domain-specific language for doing relational algebra on structured data at rest. The only tooling a language needs is query parameterization, execution, and deserialization. We choose SQL to do what SQL is good at, which is orthogonal to what many programming languages, especially Go, are good at. The best way to use SQL is to use SQL as is, and the best practice for SQL is the best practice for SQL anywhere.</p></pre>neoasterisk: <pre><blockquote>
<p>I actually find this way of working reasonable for my own simple use cases; too much tooling is needed to do it another way. I just write my SQL in a GUI-based client, test against my database, then copy it into my .go files with sample values replaced by parameters.</p>
</blockquote>
<p>I am with you on this; that's what I've been doing as well, but it doesn't seem to be such a good solution for more complex cases. For example, how do you deal with your SQL scripts being all over the place when things get added to the project and your schema changes? Obviously you can go and change the scripts that need changing, but if you keep SQL scripts spread around a dozen files, the project becomes harder to maintain, and it's only going to get harder as the project grows.</p>
<p>Nevertheless, I might have to stick with this method for a while, as I find using Docker for testing a little bit too complex. By the way, do you have any of your SQL testing code open sourced that I could check, or do you know of any (big) open source project that has some? That might help me and my team a lot in this quest.</p>
<blockquote>
<p>This is not a common architecture. Is the client directly accessing SQL or maintaining a cache database? Normally, I would hide the database behind an API that uses an object-oriented serialization protocol such as JSON, and cache the API calls instead of maintaining a SQL database on the client.</p>
</blockquote>
<p>My bad, my explanation wasn't clear. I just meant that each developer who works on the project should be able to easily reproduce the whole schema, both on the dev/staging/production servers and on their laptops (I used the term client there by mistake). If the schema is easily reproducible, that will also help with the (integration) tests.</p>
<blockquote>
<p>This problem never occurred to me. I haven't thought about outdated dependencies since using Go. Either update, or develop and test your scripts on the oldest version you want to support.</p>
</blockquote>
<p>As I mentioned in the previous post, that query was a very specific case and probably the exception to the rule. The thing is, in Go, if you want to keep things simple (and use the standard library for your database needs), then you are more or less stuck with writing native SQL queries. I personally have no problem with that; in fact I much prefer doing it that way to dealing with the constant complexity of Hibernate. Nevertheless, it is also a fact that if a database update changes which SQL queries are valid (for the specific database), your native SQL query will fail, whereas with something like Hibernate you wouldn't have that problem. Don't get me wrong, I am not trying to advocate for Java. I much prefer developing in Go, but when trying to convince others that Go is a good candidate for the project, such things matter a lot.</p>
<p>While on this subject, I am now evaluating this apparently awesome <a href="https://github.com/jmoiron/sqlx" rel="nofollow">package</a> to see if it is worth having it as a dependency vs using the standard library.</p>
<blockquote>
<p>IMHO, SQL is a domain-specific language for doing relational algebra on structured data at rest. The only tooling a language needs is query parameterization, execution, and deserialization. We choose SQL to do what SQL is good at, which is orthogonal to what many programming languages, especially Go, are good at. The best way to use SQL is to use SQL as is, and the best practice for SQL is the best practice for SQL anywhere.</p>
</blockquote>
<p>Hmm, I hear you, but I can't say I agree 100% on this. If it were up to me, I'd be using native SQL queries in all the projects. I much prefer to see exactly what is going on than have my queries expressed in some kind of DSL like they do in Hibernate. I totally agree that by using native SQL queries you have total control, and it's definitely the best way to use SQL.</p>
<p>On the other hand, this is not about the best practice for SQL; this is about the best practice for working with databases in Go, which is entirely different from working with databases in Java or, say, Ruby. I don't think anyone nowadays expects you to take any serious Java project and not use something like Spring and Hibernate. It's a similar case with Ruby on Rails. As an example, with Spring+Hibernate, in a lot of cases you don't even have to write database code; you just get it for free. I am not saying that I want something like this in Go. No. Go has a different mindset and ethos. Go's advantage is simplicity.</p>
<p>My point is that the best practices for working with databases in Go and Java are completely different. Java is much older and has had more time for these best practices to develop. In the case of Go, it doesn't seem we are quite there yet.</p>
<p>As I previously mentioned, apart from the fact that I am evaluating the <a href="https://github.com/jmoiron/sqlx" rel="nofollow">sqlx</a> package, I am not arguing against using the standard library for database access in Go. This seems to be the "best practice". But I still have yet to find a good way to write the database (integration) tests and the migrations.</p>
<p>I highly appreciate this talk we are having, as it is helping me with my research, so thank you very much for that. :)</p></pre>xiegeo: <pre><p>I am glad to talk to you too; it also helps me think about how I code.</p>
<blockquote>
<p>but if you keep SQL scripts spread around a dozen files</p>
</blockquote>
<p>So far I have been able to keep them inside dedicated files, which are inside a dedicated package that does all the data modeling.</p>
<blockquote>
<p>if a database update ... query will fail,</p>
</blockquote>
<p>If you are careful, there are updates that do not break old queries. If all your SELECT and INSERT statements explicitly state their columns, then you can safely add a default-valued column to a table. Adding new tables is also safe. These two cover most, if not all, of my schema updates.</p>
<blockquote>
<p>each developer</p>
</blockquote>
<p>I see, that is a hard problem. Coordinating developers is a full-time job in itself.</p>
<blockquote>
<p>sqlx</p>
</blockquote>
<p>That looks like a good library. Pity I have not used it. It would have simplified my sql.Rows Scan statements, probably the most painful part of using the standard library.</p>
<blockquote>
<p>But I still have yet to find a good way to write the database (integration) tests and the migrations.</p>
</blockquote>
<p>I usually find better ways during development, not having everything figured out at the start. During a recent project, I just wrote code and refactored afterwards, once I figured out how I should test. Unless I am working in a very familiar space, I just don't have the brain power to consider everything. I have developed an intuition for keeping things loosely coupled (Go interfaces and the prohibition of cyclic dependencies are great teachers), so refactoring in Go became something I look forward to.</p>
<p>It is also possible that you will find you don't need complicated migration tools, as the simplest scripts will do, and that there are few bugs that require a database integration test to reproduce. It is more of a prayer than a prophecy, but I still think it could be true.</p></pre>HectorJ: <pre><p>Currently using <a href="https://github.com/rubenv/sql-migrate" rel="nofollow">https://github.com/rubenv/sql-migrate</a> </p></pre>