Export & Import
Learn how to export and import your Query-2jz data in various formats using the CLI.
Export Overview
Export your Query-2jz data in multiple formats for analysis, migration, or backup purposes.
Export Formats
- JSON: Human-readable format with full data structure
- CSV: Spreadsheet-compatible format for data analysis
- SQL: Database-agnostic SQL format for migration
- XML: Structured format for enterprise systems
- YAML: Human-readable configuration format
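The same record looks quite different in each of the formats above. As a minimal illustration (the record shape is an assumption, not real Query-2jz output), here is one record serialized to JSON and to a CSV row:

```javascript
// Hypothetical record -- field names are assumptions, not Query-2jz output.
const record = { id: 1, name: 'Ada', email: 'ada@example.com' };

// JSON: full structure with self-describing keys.
function toJson(rec) {
  return JSON.stringify(rec, null, 2);
}

// CSV: a header row plus a value row; the delimiter is configurable.
function toCsv(rec, delimiter = ',') {
  const fields = Object.keys(rec);
  const header = fields.join(delimiter);
  const row = fields.map((f) => String(rec[f])).join(delimiter);
  return `${header}\n${row}`;
}
```

JSON keeps structure and types recoverable; CSV trades that for compactness and spreadsheet compatibility.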
Export Types
- Data Export: Export actual data from your models
- Schema Export: Export database structure and relationships
- Model Export: Export specific models with relationships
- Query Export: Export results from custom queries
- Incremental Export: Export only changed data since last export
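Incremental export, the last type above, can be thought of as keeping only records changed since the previous run. A sketch of that idea (the record shape and `updatedAt` field are assumptions):

```javascript
// Keep records whose updatedAt is strictly after the last export time.
function incrementalSlice(records, lastExportIso) {
  const cutoff = Date.parse(lastExportIso);
  return records.filter((r) => Date.parse(r.updatedAt) > cutoff);
}
```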
Data Export
Export your data in various formats with filtering and customization options.
Data Export Commands
The following commands export data from your Query-2jz database.
Basic Data Export
# Export all data to JSON
query-2jz export data --format json --output ./exports/data.json
# Export all data to CSV
query-2jz export data --format csv --output ./exports/data.csv
# Export all data to SQL
query-2jz export data --format sql --output ./exports/data.sql
# Export with pretty formatting
query-2jz export data --format json --pretty --output ./exports/data.json
# Export with compression
query-2jz export data --format json --compress --output ./exports/data.json.gz

Model-Specific Export
# Export specific models
query-2jz export data --models User,Post,Comment --format json
# Export models with relationships
query-2jz export data --models User --include-relations --format json
# Export models with specific fields
query-2jz export data --models User --fields id,name,email --format json
# Export models with filters
query-2jz export data --models User --where '{"status":"active"}' --format json
# Export models with sorting
query-2jz export data --models User --order-by "createdAt:desc" --format json

Filtered Export
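The `--where` flag above and the range and pagination flags below all narrow the exported result set. How such filters might combine can be sketched in plain JavaScript (an illustration of the concept, not Query-2jz internals):

```javascript
// Apply a --where-style equality filter, then offset/limit paging.
function applyFilters(records, { where = {}, offset = 0, limit = Infinity } = {}) {
  const matches = (r) => Object.entries(where).every(([k, v]) => r[k] === v);
  return records.filter(matches).slice(offset, offset + limit);
}
```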
# Export with date range
query-2jz export data --since 2024-01-01 --until 2024-12-31 --format json
# Export with custom query
query-2jz export data --query "SELECT * FROM users WHERE status = 'active'" --format json
# Export with limit
query-2jz export data --limit 1000 --format json
# Export with offset
query-2jz export data --offset 100 --limit 1000 --format json
# Export with search
query-2jz export data --search "john" --format json

Advanced Export Options
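Among the options below, `--batch-size` splits a large export into chunks so the whole dataset never has to sit in memory at once. The chunking itself is simple:

```javascript
// Split an array into consecutive batches of at most batchSize items.
function toBatches(records, batchSize) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}
```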
# Export with metadata
query-2jz export data --include-metadata --format json
# Export with timestamps
query-2jz export data --include-timestamps --format json
# Export with validation
query-2jz export data --validate --format json
# Export with progress tracking
query-2jz export data --progress --format json
# Export with batch processing
query-2jz export data --batch-size 1000 --format json
# Export with custom delimiter (CSV)
query-2jz export data --format csv --delimiter ";" --output ./exports/data.csv

Schema Export
Export your database schema and model definitions.
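The exact structure of a schema export is defined by the tool; purely as an illustration, a JSON schema export for a single model might look something like this (every field name here is an assumption, not the real output format):

```javascript
// Hypothetical shape of a JSON schema export -- illustrative only.
const schemaExport = {
  models: [
    {
      name: 'User',
      fields: [
        { name: 'id', type: 'integer', primaryKey: true },
        { name: 'email', type: 'string', unique: true },
      ],
      relations: [{ type: 'hasMany', model: 'Post', foreignKey: 'userId' }],
      indexes: [{ fields: ['email'], unique: true }],
    },
  ],
};
```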
Schema Export Commands
# Export complete schema
query-2jz export schema --output ./exports/schema.json
# Export schema to SQL
query-2jz export schema --format sql --output ./exports/schema.sql
# Export schema to YAML
query-2jz export schema --format yaml --output ./exports/schema.yaml
# Export specific models schema
query-2jz export schema --models User,Post --output ./exports/schema.json
# Export schema with relationships
query-2jz export schema --include-relations --output ./exports/schema.json
# Export schema with indexes
query-2jz export schema --include-indexes --output ./exports/schema.json

Schema Export Options
# Export schema with constraints
query-2jz export schema --include-constraints --output ./exports/schema.json
# Export schema with triggers
query-2jz export schema --include-triggers --output ./exports/schema.json
# Export schema with views
query-2jz export schema --include-views --output ./exports/schema.json
# Export schema with functions
query-2jz export schema --include-functions --output ./exports/schema.json
# Export schema with comments
query-2jz export schema --include-comments --output ./exports/schema.json
# Export schema with data types
query-2jz export schema --include-types --output ./exports/schema.json

Data Import
Import data from various sources and formats into your Query-2jz database.
Basic Data Import
# Import from JSON file
query-2jz import data --format json --input ./imports/data.json
# Import from CSV file
query-2jz import data --format csv --input ./imports/data.csv
# Import from SQL file
query-2jz import data --format sql --input ./imports/data.sql
# Import with validation
query-2jz import data --validate --input ./imports/data.json
# Import with dry run
query-2jz import data --dry-run --input ./imports/data.json
# Import with progress tracking
query-2jz import data --progress --input ./imports/data.json

Import Options
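`--continue-on-error` versus `--rollback-on-error` below is the usual trade-off between a best-effort import and an all-or-nothing one. The best-effort side can be sketched as (the `insert` callback is a placeholder, not a Query-2jz API):

```javascript
// Best-effort import: record failures instead of aborting the run.
function importAll(records, insert) {
  const errors = [];
  let imported = 0;
  for (const record of records) {
    try {
      insert(record);
      imported += 1;
    } catch (err) {
      errors.push({ record, message: err.message });
    }
  }
  return { imported, errors };
}
```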
# Import with batch processing
query-2jz import data --batch-size 1000 --input ./imports/data.json
# Import with error handling
query-2jz import data --continue-on-error --input ./imports/data.json
# Import with rollback on error
query-2jz import data --rollback-on-error --input ./imports/data.json
# Import with logging
query-2jz import data --log-file ./import.log --input ./imports/data.json
# Import with custom delimiter (CSV)
query-2jz import data --format csv --delimiter ";" --input ./imports/data.csv
# Import with encoding specification
query-2jz import data --encoding utf-8 --input ./imports/data.json

Selective Import
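The `--field-mapping` and `--transform` flags below point at user-supplied files. Hypothetical examples of what such files might contain (both shapes are assumptions):

```javascript
// mapping.json (hypothetical): source column -> target field.
const mapping = { user_name: 'name', user_email: 'email' };

// transform.js (hypothetical): reshape one record before insert,
// renaming fields per the mapping and normalizing the email.
function transform(record) {
  const out = {};
  for (const [from, to] of Object.entries(mapping)) {
    if (from in record) out[to] = record[from];
  }
  if (out.email) out.email = out.email.toLowerCase();
  return out;
}
```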
# Import specific models
query-2jz import data --models User,Post --input ./imports/data.json
# Import with field mapping
query-2jz import data --field-mapping ./mapping.json --input ./imports/data.json
# Import with data transformation
query-2jz import data --transform ./transform.js --input ./imports/data.json
# Import with filters
query-2jz import data --where '{"status":"active"}' --input ./imports/data.json
# Import without overwriting existing data
query-2jz import data --skip-existing --input ./imports/data.json
# Import with update existing data
query-2jz import data --update-existing --input ./imports/data.json

Schema Import
Import database schema and model definitions.
Schema Import Commands
# Import schema from JSON
query-2jz import schema --format json --input ./imports/schema.json
# Import schema from SQL
query-2jz import schema --format sql --input ./imports/schema.sql
# Import schema from YAML
query-2jz import schema --format yaml --input ./imports/schema.yaml
# Import schema with validation
query-2jz import schema --validate --input ./imports/schema.json
# Import schema with dry run
query-2jz import schema --dry-run --input ./imports/schema.json
# Import schema with backup
query-2jz import schema --backup --input ./imports/schema.json

Schema Import Options
# Import schema with constraints
query-2jz import schema --include-constraints --input ./imports/schema.json
# Import schema with indexes
query-2jz import schema --include-indexes --input ./imports/schema.json
# Import schema with triggers
query-2jz import schema --include-triggers --input ./imports/schema.json
# Import schema with views
query-2jz import schema --include-views --input ./imports/schema.json
# Import schema with functions
query-2jz import schema --include-functions --input ./imports/schema.json
# Import schema with data types
query-2jz import schema --include-types --input ./imports/schema.json

Migration Tools
Tools for migrating data between different databases and formats.
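Cross-database migration largely comes down to translating types and constraints between dialects. A tiny sketch of a MySQL-to-PostgreSQL type map (a few common mappings, nowhere near a complete table):

```javascript
// Common MySQL -> PostgreSQL column type translations (partial list).
const typeMap = {
  'TINYINT(1)': 'BOOLEAN',
  DATETIME: 'TIMESTAMP',
  LONGTEXT: 'TEXT',
  DOUBLE: 'DOUBLE PRECISION',
};

function translateType(mysqlType) {
  return typeMap[mysqlType] || mysqlType; // pass through unmapped types
}
```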
Database Migration
# Migrate from MySQL to PostgreSQL
query-2jz migrate --from mysql --to postgresql --source-db mysql://... --target-db postgresql://...
# Migrate from SQLite to PostgreSQL
query-2jz migrate --from sqlite --to postgresql --source-db ./old.db --target-db postgresql://...
# Migrate with data transformation
query-2jz migrate --from mysql --to postgresql --transform ./migration.js
# Migrate with validation
query-2jz migrate --from mysql --to postgresql --validate
# Migrate with dry run
query-2jz migrate --from mysql --to postgresql --dry-run
# Migrate with progress tracking
query-2jz migrate --from mysql --to postgresql --progress

Data Migration
# Migrate data with field mapping
query-2jz migrate data --field-mapping ./mapping.json --source ./source.json --target ./target.json
# Migrate data with transformation
query-2jz migrate data --transform ./transform.js --source ./source.json --target ./target.json
# Migrate data with validation
query-2jz migrate data --validate --source ./source.json --target ./target.json
# Migrate data with batch processing
query-2jz migrate data --batch-size 1000 --source ./source.json --target ./target.json
# Migrate data with error handling
query-2jz migrate data --continue-on-error --source ./source.json --target ./target.json

Export/Import Configuration
Configure export and import settings in your Query-2jz configuration file.
// query-2jz.config.js
module.exports = {
  export: {
    // Default export settings
    defaultFormat: 'json',
    defaultPath: './exports',
    compression: {
      enabled: false,
      algorithm: 'gzip',
      level: 6
    },
    // Export formats configuration
    formats: {
      json: {
        pretty: true,
        includeMetadata: true,
        includeTimestamps: true
      },
      csv: {
        delimiter: ',',
        includeHeaders: true,
        encoding: 'utf-8'
      },
      sql: {
        includeSchema: true,
        includeData: true,
        includeIndexes: true
      }
    },
    // Export filters
    filters: {
      defaultLimit: 10000,
      maxLimit: 100000,
      allowedModels: ['User', 'Post', 'Comment']
    }
  },
  import: {
    // Default import settings
    defaultPath: './imports',
    validation: {
      enabled: true,
      strict: false
    },
    // Import options
    options: {
      batchSize: 1000,
      continueOnError: false,
      rollbackOnError: true,
      skipExisting: false,
      updateExisting: false
    },
    // Import formats configuration
    formats: {
      json: {
        validateSchema: true,
        strictMode: false
      },
      csv: {
        delimiter: ',',
        encoding: 'utf-8',
        skipEmptyLines: true
      },
      sql: {
        validateSchema: true,
        includeConstraints: true
      }
    }
  }
};

Best Practices
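Several of the points below (back up first, dry-run before writing, batch large datasets) compose into a single routine. A sketch of that order of operations (the step functions are caller-supplied placeholders, not real Query-2jz APIs):

```javascript
// Run an import safely: backup, dry run, then the real batched import.
// Each step is a caller-supplied function; the names are placeholders.
function safeImport({ backup, dryRun, importBatch }, batches) {
  backup();
  if (!dryRun()) {
    throw new Error('dry run failed; aborting before any data is written');
  }
  let imported = 0;
  for (const batch of batches) {
    imported += importBatch(batch);
  }
  return imported;
}
```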
Do
- Validate data before importing
- Use dry-run mode for testing
- Backup data before major imports
- Use batch processing for large datasets
- Monitor import/export progress
- Use appropriate file formats for your use case
- Document import/export procedures
Don't
- Skip data validation
- Import large datasets without batching
- Ignore error messages during import
- Use production data for testing
- Skip backup before major operations
- Ignore file format limitations
- Skip progress monitoring for long operations