GSA integration for external indexing and searching

Project description

Introduction

The collective.gsa package integrates a Plone site with a Google Search Appliance (GSA). It provides an indexing processor for collective.indexing as well as search capabilities.

collective.gsa is currently tested with, and runs only on, Plone 3.x. To run collective.gsa with Plone 4.x you have to modify the search template to meet Plone 4 requirements.

Installation

Add collective.gsa to your buildout.cfg, in both the eggs and zcml sections:

[buildout]

eggs = collective.gsa

[instance]
zcml =
    collective.gsa
    collective.gsa-overrides

After running buildout and restarting the server, you can install the package via the Quick Installer, either in the ZMI or through Plone's Add/Remove Products. Once it is installed, the GSA settings and GSA maintenance configlets appear in the Plone Control Panel. Follow the field descriptions to set it up.

Global reindex

The GSA maintenance configlet contains a tool to reindex the whole site. On a large site, memory-related issues may appear, so the reindex can be run piece by piece by batching the objects.

If you would rather run many small batches, there is an example script, global_reindex.py, in the example folder which runs the batch reindexes repeatedly.
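For illustration only, such a driver could look like the sketch below. The gsa-maintenance view name appears elsewhere in this package, but the request parameters and the completion check are assumptions; consult the shipped global_reindex.py for the real invocation.

    import urllib
    import urllib2

    SITE = "http://localhost:8080/plone"  # hypothetical site URL
    BATCH_SIZE = 100                      # objects per batch (assumed knob)

    start = 0
    while True:
        # Ask the maintenance view to reindex one batch of objects.
        params = urllib.urlencode({"batch_size": BATCH_SIZE,
                                   "batch_start": start})
        body = urllib2.urlopen("%s/gsa-maintenance?%s" % (SITE, params)).read()
        if "finished" in body.lower():    # assumed completion marker
            break
        start += BATCH_SIZE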

Indexing

collective.gsa registers an adapter for IQueueIndexProcessor, and indexing is done via the collective.indexing package. When an object is reindexed, the content provider adapter is called to obtain the data.

The package contains content providers for objects implementing IATDocument, IATFile and IATContentType.
  • For document CTs (Page, News Item, etc.) the rendered main macro is sent (usually the page without portlets and the header).

  • For file CTs the primary file field is sent.

  • For other Archetypes-based CTs the title and description are sent.

To add support for other types, create your own content provider implementing the IContentProvider interface and register it via ZCML, as sketched below. For details, look at the content_provider module and the package's configure.zcml.
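As a rough sketch (not the package's actual code), a custom provider could look like this; the import path for IContentProvider and the method name content are assumptions, so mirror whatever content_provider.py actually defines:

    from zope.component import adapts
    from zope.interface import implements

    from collective.gsa.interfaces import IContentProvider  # assumed path
    from my.package.interfaces import IMyType               # hypothetical type

    class MyTypeContentProvider(object):
        """Return the data collective.gsa feeds to GSA for IMyType objects."""
        implements(IContentProvider)
        adapts(IMyType)

        def __init__(self, context):
            self.context = context

        def content(self):  # assumed method name -- check content_provider.py
            # Analogous to the built-in providers: title plus body text.
            return "%s\n%s" % (self.context.Title(), self.context.getText())

The class is then registered with a plain <adapter /> directive in your own configure.zcml, just like the providers registered in gsa's configure.zcml.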

The package supports dual indexing if you have two sites, e.g. a secure one for edit access and a public one for anonymous access. The object's identifier in GSA is its URL, which is obtained using the object's absolute_url method. All indexing therefore has to be done from the URL you want the object indexed under (i.e. not from localhost). In the GSA control panel you can set a dual base URL for the anonymous site; the URL is then constructed from the dual base URL plus the result of absolute_url_path.
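A minimal sketch of that URL construction, assuming the configured dual base URL carries no trailing slash:

    def public_feed_url(context, dual_base_url):
        # absolute_url_path is the standard Zope/Plone method mentioned above.
        return dual_base_url.rstrip("/") + context.absolute_url_path()

    # e.g. public_feed_url(page, "http://www.example.org")
    # -> "http://www.example.org/news/my-page"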

When an object is reindexed, the feed is added to a persistent queue and removed once it has been successfully sent to GSA. Hence, if GSA is unreachable, the feed will be sent when another object is reindexed.
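Illustratively (this is not the package's actual queue code), the retry semantics amount to:

    def flush_queue(queue, send_to_gsa):
        # Feeds stay in the persistent queue until GSA confirms receipt.
        while queue:
            feed = queue[0]
            try:
                send_to_gsa(feed)  # raises if GSA is unreachable
            except IOError:
                break              # feed stays queued for the next reindex
            queue.pop(0)           # removed only after a successful send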

The fact that GSA received the feed does not mean it is going to be indexed (e.g. if the URL is not in the Matched URLs settings). If your objects are not indexed, please check the GSA's Crawl and Index settings.

Searching

This package replaces the search template and the livesearch script to use GSA as the search engine. A gsasearch=on parameter is added to the search request, so that GSA is not used for internal searches (such as navigation, folder contents, etc.).
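In effect the routing rule is the sketch below (a hypothetical condensation, not the template code itself):

    def use_gsa(request):
        # Only explicit user searches carry the flag; internal catalog
        # queries (navigation, folder listings) omit it and use ZCatalog.
        return request.get("gsasearch") == "on"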

Plone's advanced search remains at the default search_form template and does not use GSA at all, because GSA does not handle indexes the way Zope's ZCatalog does. However, you can use GSA's own advanced search; its URL can be set in the local GSA control panel.

Uninstall

To remove collective.gsa, just uninstall it via the Quick Installer and remove it from your buildout.

Current Status

The basic implementation is nearly finished and we aim to write the necessary tests for it.

Credit

This code was inspired by the collective.solr package and was kindly sponsored by the University of Leicester.

Changelog

1.0.9 - 2011-01-25

  • do not use ?searchterm= if the URL is not in the portal

1.0.8 - 2011-01-21

  • fixed a keyword error when not doing a GSA search

1.0.7 - 2010-11-11

  • use the default view for all Archetypes-based objects

  • added client’s frontend to settings

  • added a ConflictError bypass to the catch-all exception handlers

  • fix login issues with GSA 6.4.0

  • when GSA does not accept the auth cookie, return public search results and display an info viewlet to the user

1.0.6 - 2010-06-21

  • fixed an issue when adding searchterm to the results URL: Plone did not expect the results to have additional parameters and hardcoded a ‘?’, which was then duplicated

  • changed some info-level logging to debug level

1.0.5 - 2009-08-11

  • added a filter on bad characters; fixes GSA not finishing indexing

  • added an uninstall method

1.0.4 - 2009-07-13

  • added a render method to the overridden searchbox viewlet; fixes compatibility with Plone 3.1.2

  • removed mechanize from required packages (Zope 2 ships its own, different version)

1.0.3 - 2009-06-24

  • Included global_reindex script to run the reindex (by Steven Hayles)

  • ‘Start over’ button at gsa-maintenance view only resets the ‘already reindexed objects’ number

1.0.2 - 2009-06-08

  • Removed rank from LiveSearch if zero

  • Filter creators to only those existing in the current Plone instance

  • When reindexing files, commit after each one

  • Added a ‘straight’ option for the GSA indexer to skip the persistent queue (for global reindex)

1.0.1 - 2009-05-29

  • Added better memory error handling

1.0

  • Initial release
