
Synchronize content between Plone instances

Project description

A tool to synchronize content between Plone instances.

How to install?

o In Plone 3, go to portal_quickinstaller and install collective.synchro.

o In Plone 2.5, go to portal_setup, select the collective.synchro profile as the active configuration on the Properties tab, then click "Import all steps" on the Import tab.

How does it work?

This tool imports content via plugins. By default there are three plugins: fss, zexp and delete. You can write and register new plugins (see PLUGIN.txt) for your use case.
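The shape of such a plugin can be pictured with a small sketch. The class and method names below (ZexpPlugin, export, import_) are hypothetical, not the actual collective.synchro interface; see PLUGIN.txt for the real contract.

```python
# Hypothetical sketch of a synchronization plugin; names are illustrative,
# not the real collective.synchro API (see PLUGIN.txt for that).

class ZexpPlugin:
    """Serialize an object to a zexp-style payload and restore it."""

    name = "zexp"

    def export(self, obj):
        # A real plugin would serialize the object (e.g. via a ZODB export);
        # here we just wrap the object id to keep the sketch self-contained.
        return ("zexp", obj["id"])

    def import_(self, payload, container):
        # Recreate the object in the target container from the payload.
        kind, obj_id = payload
        container[obj_id] = {"id": obj_id}
        return container[obj_id]


plugin = ZexpPlugin()
site = {}
payload = plugin.export({"id": "front-page"})
plugin.import_(payload, site)
```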

It stores exported data on the file system so that it can be reimported into another instance (or into multiple instances). An external program must copy the data to an import queue. This egg provides scripts for ssh transport. Add the following part to your buildout to configure the scripts (example buildouts are available in the collective/synchro/buildout directory):

[synchro]
recipe = zc.recipe.egg
eggs =
   collective.synchro

extra-paths=
   ${instance:location}
   ${zope2:location}/lib/python

This creates three scripts in your buildout's bin directory:

o bin/create_queue -d PATH, --directory=PATH : create a queue structure

o bin/synchronize_queue -s SOURCE -d DEST : synchronize a queue over ssh (using ssh keys) (-h for other options)

o bin/import_queue -p PATH : import data from a queue (-h for other options)

Important: import_queue must be run through a ZEO client instance:

$ bin/instance run bin/import_queue ...

Synchronization is fired by the Zope 3 event system. A generic handler, collective.synchro.events.synchro, is called for all events.

Plugins are chosen per event (the plugin lists for ObjectModified and ObjectRemovedEvent are different). This registry is managed by the synchronization_tool.
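The per-event registry can be sketched as a simple mapping from event type to plugin names. This is only an illustration of the idea; the synchronization_tool's real data structure may differ.

```python
# Illustrative sketch of a per-event plugin registry, as maintained by
# the synchronization_tool; the structure shown here is an assumption.

registry = {
    "ObjectModifiedEvent": ["fss", "zexp"],
    "ObjectRemovedEvent": ["delete"],
}

def plugins_for(event_name):
    """Return the plugin names registered for a given event type."""
    return registry.get(event_name, [])
```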

The location of an import is determined by querying a multi-adapter that decides where the content is reimported. You can change the location of your imported content via the ZCA (Zope Component Architecture).
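The idea behind this lookup can be shown with a pure-Python stand-in. collective.synchro does this through the ZCA (multi-adapter registration in ZCML), so everything below — the locator functions, the payload keys, the routing policy — is a hypothetical illustration, not the package's API.

```python
# Pure-Python stand-in for the multi-adapter lookup that decides where
# imported content lands. In collective.synchro this is a ZCA multi-adapter;
# the names and payload keys here are illustrative assumptions.

def default_locator(context, payload):
    # Default policy: reimport at the same path the object was exported from.
    return payload["path"]

def news_locator(context, payload):
    # Example override: route all News Items into a dedicated folder.
    return "/imports/news/" + payload["path"].rsplit("/", 1)[-1]

# Registering a more specific "adapter" overrides the default policy —
# the moral equivalent of a more specific ZCML registration.
locators = {"News Item": news_locator}

def locate(context, payload):
    locator = locators.get(payload["portal_type"], default_locator)
    return locator(context, payload)
```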

The structure of a queue looks like this:

./IMPORT
./IMPORT/TO_PROCESS -> files waiting to be processed
./IMPORT/DONE -> files synchronized from an EXPORT queue
./IMPORT/ERROR -> data in error
./IMPORT/PROCESSING -> files currently being synchronized from an EXPORT queue
./EXPORT
./EXPORT/TO_PROCESS -> files scheduled for export
./EXPORT/DONE -> files that have been imported into the instance
./EXPORT/ERROR -> files in error
./EXPORT/PROCESSING -> files currently being imported from the queue
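The layout above, and the way a file moves through it, can be exercised with plain Python. This mirrors what bin/create_queue -d PATH sets up, but it is only a sketch of the directory structure, not the script itself.

```python
import os
import shutil
import tempfile

# Create the queue layout described above (a sketch of what
# bin/create_queue -d PATH produces, not the script itself).
STATES = ["TO_PROCESS", "PROCESSING", "DONE", "ERROR"]

def create_queue(root):
    for side in ("IMPORT", "EXPORT"):
        for state in STATES:
            os.makedirs(os.path.join(root, side, state))

root = tempfile.mkdtemp()
create_queue(root)

# A file moves TO_PROCESS -> PROCESSING -> DONE as it is handled.
src = os.path.join(root, "EXPORT", "TO_PROCESS", "0001.zexp")
open(src, "w").close()
shutil.move(src, os.path.join(root, "EXPORT", "PROCESSING", "0001.zexp"))
shutil.move(os.path.join(root, "EXPORT", "PROCESSING", "0001.zexp"),
            os.path.join(root, "EXPORT", "DONE", "0001.zexp"))
```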

How to configure the export?

Go to the ZMI and, in portal_synchronisation, configure:

o queues : filesystem paths (created if they don't exist; one queue per target instance)

o expressions : a TAL expression that must be true for content to be synchronized
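For example, a minimal expression restricting synchronization to Document objects might look like the fragment below (this assumes the content object is bound as object when the expression is evaluated; adapt the variable name and condition to your use case):

```
python: object.portal_type == 'Document'
```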

Compatibility

This package is tested with Plone 2.5 and Plone 3.1.

TODO

o interface to register/unregister plugins in the ZMI

o callback for imported content

Changelog

1.0.2 - Plone 2.5.2 compliant

  • tested on Plone 2.5.2 with Five 1.4.2

  • reindex objects after import

  • fix fss import for Five 1.4.2

1.0.1 - Initial

  • Added scripts to synchronize and import content

  • Fixed a bug in the delete plugin (see export.txt)

1.0.0 - Unreleased

  • Initial release
