I'm still new, and I've been trying to figure out how to do integration tests that require DB access for the past 2-3 days and I've gotten nowhere. I don't know how to structure my code for such tests.
This is how I usually structure my code when I don't write tests; it's a habit that stuck from a codebase I found on GitHub:
https://play.golang.org/p/xUwpgnIU2Ko
If anyone can give me some advice on how to structure it better and make it easier to test, I'd really appreciate it. I'm self-taught and having a bit of a hard time with testing.
Comments:
hybsuns: Try using the DAO pattern. The basic idea is that your database object should be wrapped in an interface that exposes business-logic operations. For instance:

type DAO interface {
    FetchWidget() (*Widget, error)
    NewWidget() (*Widget, error)
    StoreWidget(*Widget) error
}

You would then implement something like a postgresDAO that satisfies DAO. The win for you is that you can create implementations of your DAO for testing. These don't have to be backed by databases at all (which is preferable in many cases).
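A minimal in-memory implementation for tests might look like this (a sketch, assuming the DAO and Widget types above plus the standard errors package; all names are illustrative):

type fakeDAO struct {
    widgets []*Widget
}

func (f *fakeDAO) FetchWidget() (*Widget, error) {
    if len(f.widgets) == 0 {
        return nil, errors.New("no widgets stored")
    }
    return f.widgets[0], nil
}

func (f *fakeDAO) NewWidget() (*Widget, error) {
    return &Widget{}, nil
}

func (f *fakeDAO) StoreWidget(w *Widget) error {
    f.widgets = append(f.widgets, w)
    return nil
}

Tests that exercise your business logic can then be handed a *fakeDAO instead of a real database connection.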
never-_-laugh: I believe there are some serious issues in the architecture of your project that contribute to the difficulty of testing.
Firstly, you put your entity Movie and the data access code func (m *movieRepo) Read() ([]Movie, error) in the same layer of the system. This is generally a bad design, since Movie is business logic and func (m *movieRepo) Read() ([]Movie, error) is a detail (data access). They should be put into separate packages (at least).
The second issue is that your data access code is tightly coupled with package app. This is not desirable, since your data access code is already a low-level module and deals with I/O. In your case, you cannot instantiate a repository unless you instantiate an App. However, your App is really just a connection to the database, so I would question whether such a layer of abstraction is necessary.
I believe your data access code should be structured like this:
package dataaccess

import "database/sql"

type PostgresMovieRepository struct {
    Database *sql.DB
}

// Read queries the movies table. Movie is the entity type from your
// business-logic package.
func (m PostgresMovieRepository) Read() ([]Movie, error) {
    var movies []Movie
    // query movies and scan the rows into the slice here
    return movies, nil
}
In your test code, you can do something like this:
package dataaccess_test

import (
    "testing"

    "yourmodule/dataaccess" // adjust to your module path
)

var movieRepo = dataaccess.PostgresMovieRepository{
    Database: CreateYourDBConnectionHere(),
}

func TestPostgresMovieRepository_Read(t *testing.T) {
    movies, err := movieRepo.Read()
    // add your assertion statements here, for example:
    if err != nil {
        t.Fatalf("Read() returned an error: %v", err)
    }
    _ = movies
}
Note that your connection can be an actual DB connection or a mock connection.
I would suggest looking into the dependency inversion principle (the D in SOLID). It can guide you in structuring your code.
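Applied here, it might look like this (a sketch; names are illustrative): the business-logic package owns the interface, and the data access package satisfies it.

package business

// MovieRepository is owned by the business-logic layer; the
// dataaccess package provides a Postgres implementation, and
// tests can provide an in-memory one.
type MovieRepository interface {
    Read() ([]Movie, error)
}

type MovieService struct {
    Repo MovieRepository
}

func (s MovieService) ListMovies() ([]Movie, error) {
    return s.Repo.Read()
}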
f12_amish: Have you considered using Docker? You could stand up a Docker environment and populate the DB with sample data to run your tests. That environment would just get destroyed every time, keeping your live DB safe. There is obviously a lot more detail to this, and setting it up would require you to get familiar with Docker, docker-compose, and maybe write a bash script. The end result is a good way to reproduce an environment similar to production while keeping the live DB clean.
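For example, a minimal sketch of that flow in Go (the docker command, DSN, and movies table are placeholders, not a definitive setup):

// Start the database first, e.g.:
//   docker run --rm -d -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres
package dataaccess_test

import (
    "database/sql"
    "log"
    "os"
    "testing"

    _ "github.com/lib/pq"
)

var testDB *sql.DB

func TestMain(m *testing.M) {
    var err error
    testDB, err = sql.Open("postgres",
        "postgres://postgres:secret@localhost:5432/postgres?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    // create the schema and seed sample data (hypothetical table)
    if _, err := testDB.Exec(
        `CREATE TABLE IF NOT EXISTS movies (id SERIAL PRIMARY KEY, title TEXT)`); err != nil {
        log.Fatal(err)
    }
    code := m.Run()
    testDB.Close()
    os.Exit(code)
}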
bschwind: I'm just getting into Docker myself and wondered how you give the app access to a Docker container. Do you just set up Docker on a specified port and have a static DB URL where the app checks for the database?
never-_-laugh: "Do you just set up Docker on a specified port and have a static DB URL where the app checks for the database?"
This is pretty much how it works. Docker containers can bind to ports on the host network, so a Postgres container can bind to port 5432 (or any other port you specify), and your app will connect to it like any other database endpoint.
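A minimal sketch of that (credentials and the host port are placeholders):

package main

import (
    "database/sql"
    "log"

    _ "github.com/lib/pq"
)

func main() {
    // Container started with its port 5432 published to host port 5433:
    //   docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=secret postgres
    db, err := sql.Open("postgres",
        "host=localhost port=5433 user=postgres password=secret dbname=postgres sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    if err := db.Ping(); err != nil {
        log.Fatal(err)
    }
    log.Println("connected to the dockerized database")
}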
earthboundkid: That is one approach. You can also create a Docker network, but that may be more than you need. I would definitely recommend reading the latest Docker docs. I would also say to be careful with some of the tutorials people post. Not that they are bad, but be mindful of the docker-compose version they use: features have been added that change the ideal docker-compose workflow. So I would try to stick with the official docs, or anything posted very recently.
p4r14h: https://medium.com/@benbjohnson/standard-package-layout-7cdbc8391fc1
So in practice, when you have multiple environments like dev, staging (integ), and prod, you'd have a config struct that stores all the runtime configuration (like your SQL DSN). You can then set an env var like COMPANY_ENV=production, which determines which config file is loaded and deserialized into your config struct.
For instance, we have:
base.yaml
dev.yaml
production.yaml
In my dev.yaml I have a set of nested structures: ‘datastore.mysql.dsn’
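A minimal sketch of that loading step (the struct shape and the use of gopkg.in/yaml.v2 are my assumptions, and the base.yaml merge is skipped for brevity):

package config

import (
    "fmt"
    "os"

    "gopkg.in/yaml.v2"
)

type Config struct {
    Datastore struct {
        MySQL struct {
            DSN string `yaml:"dsn"`
        } `yaml:"mysql"`
    } `yaml:"datastore"`
}

// Load reads the YAML file matching COMPANY_ENV (defaulting to dev)
// and deserializes it into a Config.
func Load() (*Config, error) {
    env := os.Getenv("COMPANY_ENV")
    if env == "" {
        env = "dev"
    }
    data, err := os.ReadFile(env + ".yaml")
    if err != nil {
        return nil, fmt.Errorf("reading config: %w", err)
    }
    var cfg Config
    if err := yaml.Unmarshal(data, &cfg); err != nil {
        return nil, err
    }
    return &cfg, nil
}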
In my main, this file is loaded and turned into a config struct that is passed around (similar to your App type), and in my db.go I use Gorm to either create a local SQLite DB for dev or pass a remote connection string for staging/production.
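A rough sketch of that db.go (using the jinzhu/gorm v1 API; the helper name and SQLite file name are mine):

package db

import (
    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/mysql"
    _ "github.com/jinzhu/gorm/dialects/sqlite"
)

// Open returns a local SQLite DB in dev and a remote MySQL
// connection everywhere else.
func Open(env, dsn string) (*gorm.DB, error) {
    if env == "dev" {
        return gorm.Open("sqlite3", "dev.db")
    }
    return gorm.Open("mysql", dsn)
}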
For integration tests you probably want to use the same SQL server as production, so you'd need a separate SQL database available in your staging environment. When you start the test suite you can bootstrap the DB and schemas; the only difference is that you'd pass in a staging config struct, then execute the code you want to test and verify the expected result.
