NAME
DBIx::Class::Migration - Use the best tools together for sane database migrations
SYNOPSIS
    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      schema_args  => \@connect_opts);

Alternatively:

    use DBIx::Class::Migration;
    use MyApp::Schema;

    my $migration = DBIx::Class::Migration->new(
      schema => MyApp::Schema->connect(@connect_opts));

Informational Commands:

    $migration->status;

Preparing and using Migrations:

    $migration->prepare;
    $migration->install;
    $migration->upgrade;
    $migration->downgrade;

Commands for working with Fixtures:

    $migration->dump_named_sets;
    $migration->dump_all_sets;
    $migration->populate;

Utility Commands:

    $migration->drop_tables;
    $migration->delete_table_rows;
    $migration->make_schema;
    $migration->install_if_needed;
DESCRIPTION
DBIx::Class::DeploymentHandler is a state of the art solution to the problem of creating sane workflows for versioning DBIx::Class managed database projects. However, since it is more of a toolkit for building custom versioning and migration workflows than an expression of a particular migration practice, it might not always be the most approachable tool. If you are starting a new DBIx::Class project and you don't have a particular custom workflow need, you might prefer to simply be given a reasonably clear and standard practice, rather than a toolkit with a set of example scripts.
DBIx::Class::Migration defines some logic which combines both DBIx::Class::DeploymentHandler and DBIx::Class::Fixtures, along with a standard tutorial, to give you a simple and straightforward approach to solving the problem of how to best create database versions, migrations and testing data. Additionally it builds on tools like Test::mysqld and Test::postgresql along with DBD::SQLite in order to assist you in quickly creating a local development database sandbox. It offers some integration points for testing your database, via tools like Test::DBIx::Class, in order to make testing your database driven logic less painful. Lastly, we offer some thoughts on good development patterns in using databases with application frameworks like Catalyst.
DBIx::Class::Migration offers code and advice based on my experience of using DBIx::Class for several years, which hopefully can help you bootstrap a new project. The solutions given should work for you if you want to use DBIx::Class and have database migrations, but don't really know what to do next. These solutions should scale upward from a small project to a medium project involving many developers and more than one target environment (DEV -> QA -> Production). If you have very complex database versioning requirements, huge teams and difficult architectural issues, you might be better off building something on top of DBIx::Class::DeploymentHandler directly.
DBIx::Class::Migration is a base class upon which interfaces like DBIx::Class::Migration::Script are built.
Please see DBIx::Class::Migration::Tutorial for more approachable documentation. If you want to read a high level feature overview, see DBIx::Class::Migration::Features. The remainder of this POD is API level documentation on the various internals.
ATTRIBUTES
This class defines the following attributes.
db_sandbox_builder_class
Accepts Str. Defaults to 'DBIx::Class::Migration::TargetDirSandboxBuilder'.
The name of the helper class which builds the class that builds database sandboxes. By default we build database sandboxes in the "target_dir", which is what DBIx::Class::Migration::TargetDirSandboxBuilder does. We can also build database sandboxes in a temporary directory using DBIx::Class::Migration::TempDirSandboxBuilder. You might prefer that for running tests, for example.
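For example, a test script that doesn't want sandbox files left under the share directory could select the temporary directory builder (a minimal sketch based on the classes named above):

    use DBIx::Class::Migration;

    # Build the database sandbox in a temporary directory instead of
    # target_dir, so it disappears when the temporary directory is cleaned up.
    my $migration = DBIx::Class::Migration->new(
      schema_class             => 'MyApp::Schema',
      db_sandbox_builder_class => 'DBIx::Class::Migration::TempDirSandboxBuilder',
    );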
db_sandbox_class
Accepts Str. Not Required (defaults to 'DBIx::Class::Migration::SqliteSandbox').
Unless you already have a database setup and running (as you probably do in production) we need to auto create a database 'sandbox' that is isolated to your local development environment. This class is a delegate that performs this job if you don't want to go to the trouble of installing and setting up a local database yourself.
This must point to a class that expects target_dir and schema_class for initialization arguments and must define a method make_sandbox that returns an array which can be sent to "connect" in DBIx::Class::Schema.
This defaults to DBIx::Class::Migration::SqliteSandbox. Currently we have support for MySQL and PostgreSQL via DBIx::Class::Migration::MySQLSandbox and DBIx::Class::Migration::PgSandbox, but you will need to side install Test::mysqld and Test::postgresql (in other words, you'd need to add these Test::* namespace modules to your Makefile.PL or dist.ini).
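For example, to develop against a MySQL sandbox instead of the default SQLite one (a sketch; it assumes Test::mysqld is installed as described above):

    use DBIx::Class::Migration;

    # Delegate sandbox creation to the MySQL sandbox class; its make_sandbox
    # returns connect information we can hand to MyApp::Schema->connect.
    my $migration = DBIx::Class::Migration->new(
      schema_class     => 'MyApp::Schema',
      db_sandbox_class => 'DBIx::Class::Migration::MySQLSandbox',
    );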
db_sandbox
Accepts: Object. Not required.
This is an instantiated object as defined by "db_sandbox_class". It is a delegate for the work of automatically creating a local database sandbox that is useful for developers and for quickly bootstrapping a project.
schema_class
Accepts Str. Not Required (but if missing, you need to populate "schema").
This is the schema we use as the basis for creating, managing and running your deployments. This should be the full package namespace defining your subclass of DBIx::Class::Schema, for example MyApp::Schema.
If the "schema_class" cannot be loaded, a hard exception will be thrown.
schema_args
Accepts ArrayRef. Required, but lazily builds from defaults.
Provides the arguments passed to connect on your "schema_class". These should connect you to a database.
This is an arrayref that would work the same as "connect" in DBIx::Class::Schema. If you choose to create an instance of DBIx::Class::Migration by providing a schema_class, you can use this to customize how we connect to a database.
If you don't provide a value, we will automatically create a SQLite based database connection with the following DSN:

    dbi:SQLite:[path to target_dir]/[db_file_name].db

Where [path to target_dir] is "target_dir" and [db_file_name] is a converted version of "schema_class". For example, if you set schema_class to:

    MyApp::Schema

then [db_file_name] would be myapp-schema.
Basically, this means you can start testing your database designs right away without a lot of effort: just point at a schema_class and get deploying!
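If you'd rather manage migrations against an existing database than use the generated SQLite sandbox, pass explicit connect information (a sketch; the DSN, user and password below are placeholders):

    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      # anything you would pass to DBIx::Class::Schema->connect
      schema_args  => ['dbi:mysql:database=myapp_dev', 'myapp_user', 'seekrit'],
    );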
schema
Accepts: Object of DBIx::Class::Schema. Not required.
If you already have a connected schema (subclass of DBIx::Class::Schema) you can simply point to it, skipping schema_class and schema_args. You might for example be using Catalyst and want to build deployments for a database that is listed in configuration:

    use MyCatalyst::App;
    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema => MyCatalyst::App->model('Schema')->schema,
      %{MyCatalyst::App->config->{extra_migration_init_args}},
    );
target_dir_builder_class
Accepts: Str. Defaults to 'DBIx::Class::Migration::ShareDirBuilder'.
This is a class that is used as a helper to build "target_dir" should the user not provide a value.
target_dir_builder
An instance of whatever is in "target_dir_builder_class". Used by the lazy build method of "target_dir" to default a directory where the migrations are actually placed.
target_dir
Accepts Str. Required (lazily builds to your distribution's /share directory).

This is the directory where we store our migration and fixture files. Inside this directory we will create fixtures and migrations sub-directories.

Although you can specify the directory, if you leave it undefined we will use File::ShareDir::ProjectDistDir to locate the /share directory for your project and place the files there. This is the recommended approach, and is the community practice for storing a distribution's non-code files. Please see File::ShareDir::ProjectDistDir as well as File::ShareDir for more information.

This uses whatever is in "schema_class" to determine your project (and to look for a share directory, which you'll need to create in your project root). If you don't have a "schema_class" defined, you must have a "schema", and we'll infer the class via ref($self->schema).

NOTE: You'll need to make the /share directory if you are going to use the default option. We don't automatically create it for you.
schema_loader_class
Accepts Str. Required.
Because your application subclass of DBIx::Class::Schema is going to change a lot, sometimes we need to generate our own schema and get one that is in a known, good state. Mostly this is used by the commands to drop tables and clear tables.
Defaults to DBIx::Class::Migration::SchemaLoader. You'll probably only need to change this if your database is crazy and you need to massage the init arguments to DBIx::Class::Schema::Loader.
schema_loader
Accepts Object. Required but lazy builds.
This is a factory that provides an autoloaded schema based on the current schema's database. It is automatically created and you are unlikely to need to set this manually.
dbic_fixture_class
Accepts Str. Required.
This is the class we use when creating instances of DBIx::Class::Fixtures. You'll probably need to review the docs for that and understand how configuration rules work in order to best take advantage of the system.
Defaults to DBIx::Class::Fixtures. You'll probably not need to change this unless you have some unusual needs regarding fixtures.
dbic_fixtures_extra_args
Accepts HashRef. Required, but defaults to an empty HashRef.
Allows you to pass some additional arguments when creating instances of "dbic_fixture_class". These arguments can be used to override the default initial arguments.
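For example, you could turn on the fixture class's debugging output when dumping or populating sets (a sketch; it assumes the underlying fixture class accepts a debug init argument, as DBIx::Class::Fixtures does):

    my $migration = DBIx::Class::Migration->new(
      schema_class             => 'MyApp::Schema',
      # merged over the default DBIx::Class::Fixtures init arguments
      dbic_fixtures_extra_args => { debug => 1 },
    );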
deployment_handler_class
Accepts Str. Required.
This is the class we use when creating instances of DBIx::Class::DeploymentHandler. It would be ideal for you to review those docs in order to better understand the overall architecture of the system.
Defaults to DBIx::Class::DeploymentHandler. You'll probably not need to change this unless you need a custom deployment handler, and if you do, I can't be sure this framework will work correctly, particularly if you are not using monotonic versioning.
dbic_dh_args
Accepts HashRef. Required and defaults to an empty hashref.
Used to pass custom args when building a DBIx::Class::DeploymentHandler. Please see the docs for that class for more. Useful args might be databases, to_version and force_overwrite.
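For example (a sketch using the argument names mentioned above; the version number and database list are illustrative):

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      dbic_dh_args => {
        databases       => ['SQLite', 'MySQL'], # produce DDL for both backends
        to_version      => 2,                   # target this schema version
        force_overwrite => 1,                   # allow regenerating existing files
      },
    );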
dbic_dh
Accepts Instance of DBIx::Class::DeploymentHandler. Required but lazily built from default data and dbic_dh_args.
You probably won't need to build your own deployment handler and pass it in (unlike schema, where it might actually be useful). Be careful if you do, since this framework makes some assumptions about your deployment handler (for example we assume you are using monotonic versioning).
When this attribute is lazily built, we merge "dbic_dh_args" with the following defaults:
    schema           => Points to $self->schema
    script_directory => Points to catdir($self->target_dir, 'migrations')
    databases        => Inferred from your connected schema, defaults to SQLite
"dbic_dh_args" will overwrite the defaults, if you pass them.
METHODS
This class defines the following methods for public use.
new
Used to create a new instance of DBIx::Class::Migration. There are a couple of paths to creating this instance.
Specify a schema_class and optionally schema_args
    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      schema_args  => [@connect_info],
    );
This is probably the most general approach, and is recommended unless you already have a connected instance of your DBIx::Class::Schema subclass.
"schema_args" would be anything you'd pass to "connect" in DBIx::Class::Schema. see "schema_args" for how we construct default connect information if you choose to leave this undefined.
Specify a schema
There may be some cases when you already have a schema object constructed and would prefer to just use that. For example, you may be using Catalyst and wish to build custom scripts using the built-in dependency and service lookup:
    use MyCatalyst::App;
    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema => MyCatalyst::App->model('Schema')->schema,
      %{MyCatalyst::App->config->{extra_migration_init_args}},
    );
Be careful of potential locking issues when using some databases like SQLite.
OPTIONAL: Specify a target_dir
Optionally, you can specify your migrations target directory (where your migrations get created), in your init arguments. This option can be combined with either approach listed above.
    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      schema_args  => [@connect_info],
      target_dir   => '/opt/database-migrations',
    );
If you leave this undefined, we default to using the share directory in your distribution root. This is generally the community supported place for non-code data, but if you have huge fixture sets you might wish to place them in an alternative location.
OPTIONAL: Specify dbic_dh_args
Optionally, you can specify additional arguments to the constructor for the "dbic_dh" attribute. Useful arguments might include additional databases we should build fixtures for, to_version and force_overwrite.

See DBIx::Class::DeploymentHandler for more information on supported init arguments. See "dbic_dh" for how we merge default arguments with your custom arguments.
Other Initial Arguments
For normal usage the remaining init args are probably not particularly useful and reflect a desire for long term code flexibility and clean design.
version
Prints to STDOUT a message regarding the current application $VERSION.
status
Returns the state of the deployed database (if it is deployed) and the state of the current schema version. Sends this as a string to STDOUT.
prepare
Creates fixtures and migrations directories under "target_dir" (if they don't already exist) and makes deployment files for the current schema. If deployment files exist, this will fail unless you set "overwrite_migrations" and "overwrite_fixtures".

The migrations directory reflects a directory structure as documented in DBIx::Class::DeploymentHandler.

If this is the first version, we create directories and initial DDL, etc. For versions greater than 1, we will also generate diffs and copy any fixture configs, etc. (as well as generating a fresh 'all_table.json' fixture config). For safety reasons, we never overwrite any fixture configs.
install
Installs either the current schema version (if already prepared) or the target version specified via a to_version flag sent in dbic_dh_args, to the database which is connected via "schema".
upgrade
Run upgrade files to bring the database into sync with the current schema version.
downgrade
Runs down files to bring the database down to the version immediately prior to the one currently installed in the database.
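Taken together, a typical development cycle with these commands looks something like this (a minimal sketch; it assumes your schema's $VERSION has just been bumped):

    use DBIx::Class::Migration;

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema');

    $migration->prepare;   # generate DDL and upgrade/downgrade files for $VERSION
    $migration->status;    # compare the schema version with the deployed database

    $migration->install;   # on a fresh database: deploy the current version

    # on a database already at the previous version, you would instead run:
    # $migration->upgrade;

    # and to roll back one version:
    # $migration->downgrade;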
drop_tables
Drops all the tables in the connected database with no backup or recovery. For real! (Make sure you are not connected to Prod, for example).
delete_table_rows
Does a delete on each table in the database, which clears out all your data but preserves tables. For real! You might want this if you need to load and unload fixture sets during testing, or perhaps to get rid of data that accumulated in the database while running an app in development, before dumping fixtures.
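For example, in a test script (a sketch; the set name 'all_tables' is the one used elsewhere in this document):

    # start each test case from a known state: empty tables, then fixtures
    $migration->delete_table_rows;
    $migration->populate('all_tables');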
dump_named_sets
Given an array of fixture set names, dump them for the current database version.
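The names correspond to fixture set configurations under your fixtures directory for the current version (the set names below are hypothetical):

    # dump only these two fixture sets for the currently deployed version
    $migration->dump_named_sets('core', 'demo_data');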
dump_all_sets
Takes no arguments; just dumps all the sets we can find for the current database version.
make_schema
Given an existing database, reverse engineer a DBIx::Class Schema in the "target_dir" (under dumped_db). You can use this if you need to bootstrap your DBIC files.
populate
Given an array of fixture set names, populate the current database version with the matching sets for that version.
Skips the table dbix_class_deploymenthandler_versions, so you don't lose deployment info (this is different from "drop_tables", which does delete it).
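For example, after a dump (or when loading sets shipped with your distribution):

    # reload the 'all_tables' set for the currently deployed version; the
    # dbix_class_deploymenthandler_versions table is left untouched
    $migration->populate('all_tables');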
diagram

Experimental feature. Although not specifically a migration task, I find it useful to output visuals of my databases. This command will place a file in your "target_dir" called db-diagram-vXXX.png, where XXX is the current schema version.

This is using the GraphViz producer (SQL::Translator::Producer::GraphViz), which in turn requires Graphviz. Since this is not always trivial to install, I do not require it. You will need to add it manually to your Makefile.PL or dist.ini and manage it yourself.
This feature is experimental and currently does not offer any options, as I am still determining the best way to meet the need without exceeding the scope of DBIx::Class::Migration. Consider this command a 'freebee' and please don't depend on it in your custom code.
install_if_needed
If the database is not installed, do so. Accepts a hash of callbacks or instructions to perform should installation be needed:
    $migration->install_if_needed(
      on_install => sub {
        my ($schema, $local_migration) = @_;
        DBIx::Class::Migration::Population->new(
          schema => $schema)->populate('all_tables');
      });
The following callbacks / instructions are permitted:

- on_install

  Accepts: Coderef

  Given a coderef, execute it after the database is installed. The coderef gets passed two arguments: $schema and $self (the current migration object).

- default_fixture_sets

  Accepts: Arrayref of fixture sets

      $migration->install_if_needed(
        default_fixture_sets => ['all_tables']);

  After database installation, populate the given fixture sets in order.
install_version_storage
If the targeted (connected) database does not have the versioning tables installed, this will install them. The version is set to whatever your schema version currently is.
You will only need to use this command in the case where you have an existing database that you are reverse engineering and you need to setup versioning storage since you can't rebuild the database from scratch (such as if you have a huge production database that you now want to start versioning).
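A sketch of adopting migrations for an existing database (the connect information below is a placeholder):

    my $migration = DBIx::Class::Migration->new(
      schema_class => 'MyApp::Schema',
      schema_args  => ['dbi:Pg:dbname=myapp_prod', 'deploy_user', 'secret'],
    );

    # the tables already exist, but there is no versioning metadata yet;
    # record the current schema version so future upgrades can be applied
    $migration->install_version_storage;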
THANKS
Because of the awesomeness of CPAN and the work of many others, all this functionality is provided with a few hundred lines of code. In fact, I spent a lot more time writing docs and tests than anything else. Here are some particular projects / people I'd like to thank:
First, thanks to mst for providing me a big chunk of code that served to kickstart my work, and served as a valuable prototype.
Thanks to frew for the awesome DBIx::Class::DeploymentHandler, which gives us such a powerful base for organizing database versions. Thanks to all the authors of DBIx::Class::Fixtures for giving me a foundation for managing sets of data. Lastly, thanks to the DBIx::Class cabal for all the work done in making the DBIx::Class ORM so amazingly powerful.
Additionally, thanks to the creators / maintainers of Test::mysqld and Test::postgresql, which made it easy to create developer level sandboxes for these popular open source databases.
As usual, thanks to the Moose cabal for making Perl programming fun and beautiful. Lastly, a shout-out to the Dist::Zilla cabal for making it so I don't need to write my own build and deployment tools.
AUTHOR
John Napiorkowski email:jjnapiork@cpan.org
SEE ALSO
DBIx::Class::DeploymentHandler, DBIx::Class::Fixtures, DBIx::Class, DBIx::Class::Schema::Loader, Moose, DBIx::Class::Migration::Script, DBIx::Class::Migration::Population, dbic-migration, SQL::Translator, Test::mysqld, Test::postgresql.
COPYRIGHT & LICENSE
Copyright 2012, John Napiorkowski email:jjnapiork@cpan.org
This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself.