Project description

Stuff to do with counters, sequences and iterables.

Latest release 20250306: New infill() and infill_from_batches() generators for identifying missing records requiring an infill.

Note that any function accepting an iterable will consume some or all of the derived iterator in the course of its operation.

common_prefix_length(*seqs)

Return the length of the common prefix of sequences seqs.

common_suffix_length(*seqs)

Return the length of the common suffix of sequences seqs.
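Both functions can be sketched simply; this is an illustrative stand-in, not the module's code, and the reversal trick for the suffix case assumes sliceable sequences:

```python
def common_prefix_length(*seqs):
    # Count leading positions where every sequence holds the same value;
    # zip() stops at the shortest sequence.
    n = 0
    for items in zip(*seqs):
        if any(item != items[0] for item in items[1:]):
            break
        n += 1
    return n

def common_suffix_length(*seqs):
    # The common suffix is the common prefix of the reversed sequences.
    return common_prefix_length(*(seq[::-1] for seq in seqs))

print(common_prefix_length("abcd", "abef"))  # 2
print(common_suffix_length("xyzcd", "qcd"))  # 2
```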

first(iterable)

Return the first item from an iterable; raise IndexError on empty iterables.

get0(iterable, default=None)

Return first element of an iterable, or the default.

greedy(g=None, queue_depth=0)

A decorator or function for greedy computation of iterables.

If g is omitted or callable this is a decorator for a generator function causing it to compute greedily, capacity limited by queue_depth.

If g is iterable this function dispatches it in a Thread to compute greedily, capacity limited by queue_depth.

Example with an iterable:

for packet in greedy(parse_data_stream(stream)):
    ... process packet ...

which does some readahead of the stream.

Example as a function decorator:

@greedy
def g(n):
    for item in range(n):
        yield item

This can also be used directly on an existing iterable:

for item in greedy(range(n)):
    ... process item ...

Normally a generator runs on demand. This function dispatches a Thread to run the iterable (typically a generator) putting yielded values to a queue and returns a new generator yielding from the queue.

The queue_depth parameter specifies the depth of the queue and therefore how many values the original generator can compute before blocking at the queue's capacity.

The default queue_depth is 0, which uses a Channel (a zero-storage buffer) as the queue; this lets the generator compute only a single value ahead of time.

A larger queue_depth allocates a Queue with that much storage allowing the generator to compute as many as queue_depth+1 values ahead of time.
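The mechanism can be sketched with a Thread and a bounded Queue. This is a simplified stand-in, not the module's implementation, which uses a zero-storage Channel for the depth-0 case; the stdlib Queue treats maxsize=0 as unbounded, so this sketch defaults to a depth of 1:

```python
import threading
from queue import Queue

_SENTINEL = object()

def greedy_sketch(iterable, queue_depth=1):
    # Run the iterable in a worker Thread, putting yielded values onto
    # a bounded Queue; the returned generator yields from that queue.
    q = Queue(maxsize=queue_depth)

    def worker():
        for item in iterable:
            q.put(item)      # blocks once the queue is full
        q.put(_SENTINEL)     # mark end of iteration

    threading.Thread(target=worker, daemon=True).start()

    def consume():
        while True:
            item = q.get()
            if item is _SENTINEL:
                return
            yield item

    return consume()
```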

Here's a comparison of the behaviour:

Example without @greedy where the "yield 1" step does not occur until after the "got 0":

>>> from time import sleep
>>> def g():
...   for i in range(2):
...     print("yield", i)
...     yield i
...   print("g done")
...
>>> G = g(); sleep(0.1)
>>> for i in G:
...   print("got", i)
...   sleep(0.1)
...
yield 0
got 0
yield 1
got 1
g done

Example with @greedy where the "yield 1" step computes before the "got 0":

>>> from time import sleep
>>> @greedy
... def g():
...   for i in range(2):
...     print("yield", i)
...     yield i
...   print("g done")
...
>>> G = g(); sleep(0.1)
yield 0
>>> for i in G:
...   print("got", repr(i))
...   sleep(0.1)
...
yield 1
got 0
g done
got 1

Example with @greedy(queue_depth=1) where both the "yield 0" and "yield 1" steps compute before the "got 0":

>>> from time import sleep
>>> @greedy(queue_depth=1)
... def g():
...   for i in range(3):
...     print("yield", i)
...     yield i
...   print("g done")
...
>>> G = g(); sleep(0.1)
yield 0
yield 1
>>> for i in G:
...   print("got", repr(i))
...   sleep(0.1)
...
yield 2
got 0
g done
got 1
got 2

imerge(*iters, **kw)

Merge an iterable of ordered iterables in order.

Parameters:

  • iters: an iterable of iterators
  • reverse: keyword parameter: if true, yield items in reverse order. This requires the iterables themselves to also be in reversed order.

This function relies on the source iterables being ordered and their elements being comparable, though slightly misordered iterables (for example, as extracted from web server logs) will produce only slightly misordered results, as the merging is done on the basis of the front elements of each iterable.
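Though the module's own implementation isn't shown here, the described semantics closely match the standard library's heapq.merge, which also merges on the front elements of ordered iterables:

```python
from heapq import merge

# Two ordered iterables merged on their front elements.
merged = list(merge([1, 4, 9], [2, 3, 10]))
print(merged)  # [1, 2, 3, 4, 9, 10]

# With reverse=True the inputs must themselves be in reversed order,
# mirroring imerge's reverse keyword parameter.
rmerged = list(merge([9, 4, 1], [10, 3, 2], reverse=True))
print(rmerged)  # [10, 9, 4, 3, 2, 1]
```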

infill(objs: Iterable[~_infill_T], *, obj_keys: Callable[[~_infill_T], ~_infill_K], existing_keys: Callable[[~_infill_T], ~_infill_K], all: Optional[bool] = False) -> Iterable[Tuple[~_infill_T, ~_infill_K]]

A generator accepting an iterable of objects which yields (obj,missing_keys) 2-tuples indicating missing records requiring infill for each object.

Parameters:

  • objs: an iterable of objects
  • obj_keys: a callable accepting an object and returning an iterable of the expected keys
  • existing_keys: a callable accepting an object and returning an iterable of the existing keys
  • all: optional flag, default False: if true then yield (obj,()) for objects with no missing records

Example:

for obj, missing_keys in infill(objs,...):
  for missing_key in missing_keys:
    ... infill a record for missing_key ...
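A minimal sketch of the documented behaviour; this is illustrative only, and the (obj, missing_keys) tuple shape follows the description above:

```python
def infill_sketch(objs, *, obj_keys, existing_keys, all=False):
    # Yield (obj, missing_keys) for each object whose expected keys
    # are not all present among its existing keys; with all=True,
    # also yield (obj, ()) for objects with nothing missing.
    for obj in objs:
        existing = set(existing_keys(obj))
        missing = tuple(k for k in obj_keys(obj) if k not in existing)
        if missing or all:
            yield obj, missing

# Hypothetical data: each object expects keys 1..3.
records = {"a": {1, 2}, "b": {1, 2, 3}}
missing = list(infill_sketch(records, obj_keys=lambda o: (1, 2, 3),
                             existing_keys=lambda o: records[o]))
print(missing)  # [('a', (3,))]
```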

infill_from_batches(objss: Iterable[Iterable[~_infill_T]], *, obj_keys: Callable[[~_infill_T], ~_infill_K], existing_keys: Callable[[~_infill_T], ~_infill_K], all: Optional[bool] = False, amend_batch: Optional[Callable[[Iterable[~_infill_T]], Iterable[~_infill_T]]] = <lambda>)

A batched version of infill(objs) accepting an iterable of batches of objects which yields (obj, missing_keys) 2-tuples indicating missing records requiring infill for each object.

This is aimed at processing batches of objects where it is more efficient to prepare each batch as a whole, such as a Django QuerySet which lets the caller make single database queries for a batch of Model instances. Thus this function can be used with cs.djutils.model_batches_qs for more efficient infill processing.

Parameters:

  • objss: an iterable of iterables of objects
  • obj_keys: a callable accepting an object and returning an iterable of the expected keys
  • existing_keys: a callable accepting an object and returning an iterable of the existing keys
  • all: optional flag, default False: if true then yield (obj,()) for objects with no missing records
  • amend_batch: optional callable to amend the batch of objects, for example to amend a QuerySet with .select_related() or similar

isordered(items, reverse=False, strict=False)

Test whether an iterable is ordered. Note that the iterable is iterated, so this is a destructive test for nonsequences.
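A sketch of the test; the exact comparison semantics of strict and reverse are assumptions, not quoted from the module:

```python
def isordered_sketch(items, reverse=False, strict=False):
    # Compare each consecutive pair; note this consumes the iterable.
    it = iter(items)
    try:
        prev = next(it)
    except StopIteration:
        return True  # an empty iterable is trivially ordered
    for item in it:
        if reverse:
            ok = item < prev if strict else item <= prev
        else:
            ok = prev < item if strict else prev <= item
        if not ok:
            return False
        prev = item
    return True
```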

last(iterable)

Return the last item from an iterable; raise IndexError on empty iterables.

onetomany(func)

A decorator for a method of a sequence to merge the results of passing every element of the sequence to the function, expecting multiple values back.

Example:

  class X(list):
      @onetomany
      def chars(self, item):
          return item

  strs = X(['Abc', 'Def'])
  all_chars = strs.chars()

onetoone(func)

A decorator for a method of a sequence to merge the results of passing every element of the sequence to the function, expecting a single value back.

Example:

  class X(list):
      @onetoone
      def lower(self, item):
          return item.lower()

  strs = X(['Abc', 'Def'])
  lower_strs = strs.lower()
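Both decorators can be sketched roughly as follows, returning plain lists here since the actual return types are not specified above:

```python
from itertools import chain

def onetoone(func):
    # Wrap a (self, item) method: apply it to every element of the
    # sequence (self), collecting one result per element.
    def wrapper(self):
        return [func(self, item) for item in self]
    return wrapper

def onetomany(func):
    # As above, but each call returns an iterable of results, which
    # are chained into a single list.
    def wrapper(self):
        return list(chain.from_iterable(func(self, item) for item in self))
    return wrapper

class Strs(list):
    @onetoone
    def lower(self, item):
        return item.lower()

    @onetomany
    def chars(self, item):
        return item  # a string is an iterable of its characters

print(Strs(['Abc', 'Def']).lower())  # ['abc', 'def']
print(Strs(['Abc', 'Def']).chars())  # ['A', 'b', 'c', 'D', 'e', 'f']
```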

Class Seq

A numeric sequence implemented as a thread safe wrapper for itertools.count().

A Seq is iterable and both iterating and calling it return the next number in the sequence.

seq()

Return a new sequential value.
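A minimal sketch of the described behaviour; the `start` and `lock` parameter names are guesses based on the release notes, not the class's actual signature:

```python
import itertools
import threading

class SeqSketch:
    # Thread-safe wrapper for itertools.count(): both iterating the
    # instance and calling it return the next number in the sequence.
    def __init__(self, start=0, lock=None):
        self._count = itertools.count(start)
        self._lock = lock or threading.Lock()

    def __iter__(self):
        return self

    def __next__(self):
        # Serialise access so concurrent callers each get a distinct value.
        with self._lock:
            return next(self._count)

    __call__ = __next__
```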

skip_map(func, *iterables, except_types, quiet=False)

A version of map() which will skip items where func(item) raises an exception in except_types, a tuple of exception types. If a skipped exception occurs a warning will be issued unless quiet is true (default False).
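A sketch of the described behaviour, using the stdlib warnings module here in place of whatever warning mechanism the real function uses:

```python
import warnings

def skip_map_sketch(func, *iterables, except_types, quiet=False):
    # Like map(), but skip items where func raises one of except_types,
    # issuing a warning unless quiet is true.
    for args in zip(*iterables):
        try:
            yield func(*args)
        except except_types as e:
            if not quiet:
                warnings.warn(f"skip_map: skipping {args!r}: {e}")
```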

splitoff(sq, *sizes)

Split a sequence into (usually short) prefixes and a tail, for example to construct subdirectory trees based on a UUID.

Example:

>>> from uuid import UUID
>>> uuid = 'd6d9c510-785c-468c-9aa4-b7bda343fb79'
>>> uu = UUID(uuid).hex
>>> uu
'd6d9c510785c468c9aa4b7bda343fb79'
>>> splitoff(uu, 2, 2)
['d6', 'd9', 'c510785c468c9aa4b7bda343fb79']
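The documented behaviour can be sketched as follows; this is illustrative, not the module's code:

```python
def splitoff_sketch(sq, *sizes):
    # Cut successive prefixes of the given sizes, then append the
    # remaining tail as the final element.
    parts = []
    offset = 0
    for size in sizes:
        parts.append(sq[offset:offset + size])
        offset += size
    parts.append(sq[offset:])
    return parts

print(splitoff_sketch('d6d9c510785c468c9aa4b7bda343fb79', 2, 2))
# ['d6', 'd9', 'c510785c468c9aa4b7bda343fb79']
```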

Class StatefulIterator

A trivial iterator which wraps another iterator to expose some tracking state.

This has 2 attributes:

  • .it: the internal iterator which should yield (item,new_state)
  • .state: the last state value from the internal iterator

The originating use case is reuse of an iterator by independent calls that are typically sequential, specifically the .read method of file-like objects. Naive sequential reads require the underlying storage to locate the data on every call, even though the previous call has just performed this task for the previous read. Saving the iterator used from the preceding call allows the iterator to pick up directly if the file offset hasn't been fiddled with in the meantime.

tee(iterable, *Qs)

A generator yielding the items from an iterable which also copies those items to a series of queues.

Parameters:

  • iterable: the iterable to copy
  • Qs: the queues, objects supporting a .put method

Note: the item is .put onto every queue before being yielded from this generator.
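The put-before-yield ordering can be sketched directly; an illustrative stand-in, not the module's code:

```python
from queue import Queue

def tee_sketch(iterable, *Qs):
    # Put each item onto every queue before yielding it onward.
    for item in iterable:
        for Q in Qs:
            Q.put(item)
        yield item

q1, q2 = Queue(), Queue()
copied = list(tee_sketch([1, 2, 3], q1, q2))
print(copied)  # [1, 2, 3]
```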

the(iterable, context=None)

Returns the first element of an iterable, but requires there to be exactly one.

Class TrackingCounter

A wrapper for a counter which can be incremented and decremented.

A facility is provided to wait for the counter to reach a specific value. The .inc and .dec methods also accept a tag argument to keep individual counts based on the tag to aid debugging.

TODO: add strict option to error and abort if any counter tries to go below zero.

TrackingCounter.__init__(self, value=0, name=None, lock=None): Initialise the counter to value (default 0) with the optional name.

TrackingCounter.check(self): Internal consistency check.

TrackingCounter.dec(self, tag=None): Decrement the counter. Wake up any threads waiting for its new value.

TrackingCounter.inc(self, tag=None): Increment the counter. Wake up any threads waiting for its new value.

TrackingCounter.wait(self, value): Wait for the counter to reach the specified value.

unrepeated(it, seen=None, signature=None)

A generator yielding items from the iterable it with no repetitions.

Parameters:

  • it: the iterable to process
  • seen: an optional setlike container supporting in and .add()
  • signature: an optional signature function for items from it which produces the value to compare to recognise repeated items; its values are stored in the seen set

The default signature function is equality; the items themselves are stored in seen and compared. This requires the items to be hashable and to support equality tests. The same applies to whatever values the signature function produces.

Another common signature is identity: id, useful for traversing a graph which may have cycles.

Since seen accrues all the signature values for yielded items, generally it will grow monotonically as iteration proceeds. If the items are complex or large it is well worth providing a signature function even if the items themselves can be used in a set.
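A sketch of the described behaviour; illustrative only, with the default signature taken to be the item itself per the note above:

```python
def unrepeated_sketch(it, seen=None, signature=None):
    # Yield each item whose signature has not been seen before.
    if seen is None:
        seen = set()
    if signature is None:
        signature = lambda item: item  # compare the items themselves
    for item in it:
        sig = signature(item)
        if sig not in seen:
            seen.add(sig)
            yield item
```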

Release Log

Release 20250306: New infill() and infill_from_batches() generators for identifying missing records requiring an infill.

Release 20250103: New skip_map(func, *iterables, except_types, quiet=False) generator function, like map() but skipping certain exceptions.

Release 20221118: Small doc improvement.

Release 20220530: Seq: calling a Seq is like next(seq).

Release 20210924: New greedy(iterable) or @greedy(generator_function) to let generators precompute.

Release 20210913: New unrepeated() generator removing duplicates from an iterable.

Release 20201025: New splitoff() function to split a sequence into (usually short) prefixes and a tail.

Release 20200914: New common_prefix_length and common_suffix_length for comparing prefixes and suffixes of sequences.

Release 20190103: Documentation update.

Release 20190101:

  • New and UNTESTED class StatefulIterator to associate some externally visible state with an iterator.
  • Seq: accept optional lock parameter.

Release 20171231:

  • Python 2 backport for imerge().
  • New tee function to duplicate an iterable to queues.
  • Function isordered() is now a test instead of an assertion.
  • Drop NamedTuple, NamedTupleClassFactory (unused).

Release 20160918:

  • New function isordered() to test ordering of a sequence.
  • imerge: accept new reverse parameter for merging reversed iterables.

Release 20160828: Modify DISTINFO to say "install_requires", fixes pypi requirements.

Release 20160827: TrackingCounter: accept presupplied lock object. Python 3 exec fix.

Release 20150118: metadata update

Release 20150111: Initial PyPI release.
