================
Scan search Type
================

Note: this test starts and runs an elasticsearch server on port 45299!

Let's do some simple tests without using a connection pool.

  >>> from pprint import pprint
  >>> from p01.elasticsearch.connection import ElasticSearchConnection
  >>> from p01.elasticsearch.exceptions import ElasticSearchServerException
  >>> from p01.elasticsearch.pool import ServerPool

  >>> servers = ['localhost:45299']
  >>> serverPool = ServerPool(servers)

Now we are able to get a connection which is persistent and stored in a
thread local, so each thread gets its own connection instance.
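
The thread-local pattern mentioned above can be sketched in plain Python as
follows. This is a generic illustration, not the library's actual
implementation; ``make_connection`` is a hypothetical factory standing in for
``ElasticSearchConnection(serverPool)``.

```python
import threading

# Each thread sees its own attribute namespace on this object, so a
# connection cached here is never shared across threads.
_local = threading.local()

def make_connection():
    # Stand-in for a real connection factory; returns a fresh object
    # per call so we can observe per-thread caching.
    return object()

def get_connection():
    # Return this thread's cached connection, creating it on first use.
    conn = getattr(_local, 'conn', None)
    if conn is None:
        conn = make_connection()
        _local.conn = conn
    return conn
```

Repeated calls within one thread return the same object, while another thread
gets its own connection.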

  >>> conn = ElasticSearchConnection(serverPool)

Set up a test index and add a few documents:

  >>> conn.createIndex('scanning')
  {u'acknowledged': True, u'ok': True}

  >>> for i in range(1000):
  ...     _id = unicode(i)
  ...     doc = {'_id': _id, 'dummy': u'dummy'}
  ...     ignored = conn.index(doc, 'scanning', 'doc')

  >>> conn.refresh('scanning')
  {u'ok': True, u'_shards': {u'successful': 5, u'failed': 0, u'total': 10}}


scan
----

Let's show how we can batch large search results with our scan method.

  >>> pprint(conn.search('dummy', 'scanning').total)
  1000

  >>> result = list(conn.scan('dummy', 'scanning'))
  >>> len(result)
  1000

  >>> pprint(sorted(result)[:5])
  [{u'_id': u'0',
    u'_index': u'scanning',
    u'_score': 0.0,
    u'_source': {u'_id': u'0', u'dummy': u'dummy'},
    u'_type': u'doc'},
   {u'_id': u'1',
    u'_index': u'scanning',
    u'_score': 0.0,
    u'_source': {u'_id': u'1', u'dummy': u'dummy'},
    u'_type': u'doc'},
   {u'_id': u'10',
    u'_index': u'scanning',
    u'_score': 0.0,
    u'_source': {u'_id': u'10', u'dummy': u'dummy'},
    u'_type': u'doc'},
   {u'_id': u'100',
    u'_index': u'scanning',
    u'_score': 0.0,
    u'_source': {u'_id': u'100', u'dummy': u'dummy'},
    u'_type': u'doc'},
   {u'_id': u'101',
    u'_index': u'scanning',
    u'_score': 0.0,
    u'_source': {u'_id': u'101', u'dummy': u'dummy'},
    u'_type': u'doc'}]
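
The batching idea behind ``scan`` can be sketched as a generator that pulls
fixed-size pages from the server until the result set is exhausted, yielding
hits one at a time so callers never hold all 1000 documents in memory at once.
This is a simplified illustration, not the library's actual implementation;
``fetch_page`` is a hypothetical stand-in for an Elasticsearch scroll request.

```python
def scan(fetch_page, batchSize=100):
    # Yield hits page by page until a fetch returns an empty batch.
    offset = 0
    while True:
        page = fetch_page(offset, batchSize)
        if not page:
            break
        for hit in page:
            yield hit
        offset += len(page)

# Toy data source simulating the 1000 indexed documents above.
docs = [{'_id': str(i), 'dummy': 'dummy'} for i in range(1000)]

def fetch_page(offset, size):
    # Stand-in for one scroll round-trip: return the next slice of hits.
    return docs[offset:offset + size]
```

Consuming the generator with ``list(scan(fetch_page))`` yields all 1000
documents in ten batches of 100, mirroring the doctest result above.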
