Faster method for migrating data in batches #20

@tompollard

Description

The current approach (`_copy_data`) for loading data from the source database relies on `LIMIT` + `OFFSET`, which gets slow as the offset grows on large tables. Replace it with a faster approach, perhaps using `yield_per` or a window function, e.g.:

        for data in source_session.query(table).yield_per(batchsize):
            _insert_data(target_session, source_schema, table, data)
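Besides `yield_per`, another option hinted at above is keyset pagination: instead of skipping `OFFSET` rows on every batch, seek directly past the last-seen primary key. A minimal sketch of that idea using only the stdlib `sqlite3` module (the real code uses SQLAlchemy; the `items` table, `id`/`value` columns, and `copy_in_batches` helper here are hypothetical):

```python
import sqlite3

def copy_in_batches(source, target, table, batch_size=1000):
    """Copy rows ordered by primary key, seeking past the last-seen id
    instead of using OFFSET (keyset pagination)."""
    last_id = 0
    while True:
        rows = source.execute(
            f"SELECT id, value FROM {table} WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:  # no rows left to copy
            break
        target.executemany(
            f"INSERT INTO {table} (id, value) VALUES (?, ?)", rows
        )
        target.commit()
        last_id = rows[-1][0]  # seek point for the next batch

# Demo with in-memory databases.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)")
source.executemany(
    "INSERT INTO items VALUES (?, ?)",
    [(i, f"row{i}") for i in range(1, 2501)],
)
copy_in_batches(source, target, "items", batch_size=1000)
print(target.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 2500
```

Each batch is a cheap index seek (`WHERE id > ?`) rather than an increasingly expensive scan-and-discard, which is why this tends to outperform `LIMIT` + `OFFSET` as tables grow.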

See:
