
A collection of Python functions for somebody's sanity


foc

Func-Oriented Code or Francis' Odd Collection.

foc is a no-frills, seamlessly integrated functional tool for Python.

  • provides a collection of higher-order functions and a placeholder lambda syntax (_)
  • provides an easy way to compose functions with symbols (^ and |)

Install

$ pip install -U foc

Use

For more information, see the examples provided with each function.

>>> from foc import *

>>> (_ + 7)(3)  # (lambda x: x + 7)(3)
10

>>> 3 | (_ + 4) | (_ * 6)  # (3 + 4) * 6
42

>>> (length ^ range)(10)  # length(range(10))
10

>>> cf_(collect, filter(even), range)(10)  # (collect . filter(even) . range)(10)
[0, 2, 4, 6, 8]

>>> ((_ * 5) ^ nth(3) ^ range)(5)  # range(5)[2] * 5  (nth is 1-based)
10

>>> cf_(sum, map(_ + 1), range)(10)  # sum(map((+1), [0..9]))
55

>>> map((_ * 3) ^ (_ + 2))(range(5)) | sum  # sum(map((*3) . (+2), [0..4]))
60

>>> range(73, 82) | map(chr) | unchars  # unchars(map(chr, range(73, 82)))
'IJKLMNOPQ'

foc provides two ways to compose functions with symbols (see the sketch after the table below).

If you don't like composing functions with symbols, use cf_.
In fact, it is the most reliable and safest way to compose any functions.

Symbol        Description                        Evaluation order   Available
^ (caret)     same as the mathematical dot (.)   backwards          every function (the first must be fx)
| (pipeline)  in the Unix pipeline manner        in order           fx-functions only
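
For instance, the same computation can be written either way. A quick sketch, using only functions shown above:

>>> (unchars ^ map(chr))(range(65, 70))    # ^: composed backwards, like (.)
'ABCDE'

>>> range(65, 70) | map(chr) | unchars     # |: applied in order, like a Unix pipe
'ABCDE'

>>> cf_(unchars, map(chr))(range(65, 70))  # cf_: the same composition without symbols
'ABCDE'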

fx-functions (short for Function eXtension) are functions decorated with @fx.
To turn a function into an fx-function on the fly, just wrap it in fx.

>>> 7 | fx(lambda x: x * 6) 
42

>>> @fx
... def fn(x, y):
...     ...
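
For example, an fx-decorated function accepts piped input with | and composes with ^. A small sketch (double is illustrative, not part of foc):

>>> @fx
... def double(x):
...     return x * 2

>>> 21 | double             # pipe a value into an fx-function
42

>>> (double ^ len)("abc")   # double(len("abc"))
6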

Partial application with placeholders is also possible when accessing items of a dict, an object, or an iterable.

Operator   Equivalent function
_[_]       op.getitem
_[item]    op.itemgetter(item)
_._        getattr
_.attr     op.attrgetter(attr)

# dict
>>> d = dict(one=1, two=2, three="three")
>>> (_[_])(d)("two")  # curry(lambda a, b: a[b])(d)("two")
2
>>> (_["one"])(d)  # (lambda x: x["one"])(d)
1
>>> cf_(_[2:4], _["three"])(d)  # d["three"][2:4]
're'

# iterable
>>> r = range(5)
>>> (_[_])(r)(3)  # curry(lambda a, b: a[b])(r)(3)
3
>>> (_[3])(r)  # (lambda x: x[3])(r)
3

# object
>>> o = type('', (), {"one": 1, "two": 2, "three": "three"})()
>>> (_._)(o)("two")  # curry(lambda a, b: getattr(a, b))(o)("two")
2
>>> (_.one)(o)  # (lambda x: x.one)(o)
1
>>> o | _.three | _[2:4]  # o.three[2:4]
're'
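
These placeholder accessors combine naturally with the higher-order functions above. A small sketch with made-up data:

>>> users = [dict(name="foo", age=7), dict(name="bar", age=35)]
>>> users | map(_["age"]) | collect    # [u["age"] for u in users]
[7, 35]
>>> users | map(_["name"]) | collect   # [u["name"] for u in users]
['foo', 'bar']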

Real-World Example

The causal self-attention of a transformer model, based on PyTorch, can be described as follows.
Somebody insists that this helps to follow the process flow without distraction (plus a 3-5% speed-up).

    def forward(self, x):
        B, S, E = x.size()  # size_batch, size_block (sequence length), size_embed
        N, H = self.config.num_heads, E // self.config.num_heads  # E == (N * H)

        q, k, v = self.c_attn(x).split(self.config.size_embed, dim=2)
        q = q.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        k = k.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        v = v.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)

        # Attention(Q, K, V)
        #   = softmax( Q*K^T / sqrt(d_k) ) * V
        #         // q*k^T: (B, N, S, H) x (B, N, H, S) -> (B, N, S, S)
        #   = attention-prob-matrix * V
        #         // prob @ v: (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        #   = attention-weighted value (attention score)

        return cf_(
            self.dropout,  # dropout of layer's output
            self.c_proj,  # linear projection
            g_(_.view)(B, S, E),  # (B, S, N, H) -> (B, S, E)
            torch.Tensor.contiguous,  # contiguous in-memory tensor
            g_(_.transpose)(1, 2),  # (B, S, N, H)
            _ @ v,  # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
            self.dropout_attn,  # attention dropout
            f_(F.softmax, dim=-1),  # softmax
            g_(_.masked_fill)(mask == 0, float("-inf")),  # no-look-ahead
            _ / math.sqrt(k.size(-1)),  # / sqrt(d_k)
            _ @ k.transpose(-2, -1),  # Q @ K^T -> (B, N, S, S)
        )(q)
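
For comparison, the same forward pass written imperatively in plain PyTorch would look roughly like this (a sketch assuming the same q, k, v, module attributes, and a mask defined elsewhere, as in the snippet above):

    att = q @ k.transpose(-2, -1)                      # Q @ K^T -> (B, N, S, S)
    att = att / math.sqrt(k.size(-1))                  # scale by sqrt(d_k)
    att = att.masked_fill(mask == 0, float("-inf"))    # no-look-ahead mask
    att = F.softmax(att, dim=-1)                       # attention probabilities
    att = self.dropout_attn(att)                       # attention dropout
    y = att @ v                                        # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
    y = y.transpose(1, 2).contiguous().view(B, S, E)   # (B, N, S, H) -> (B, S, E)
    return self.dropout(self.c_proj(y))                # linear projection + output dropout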

