
A collection of Python functions for somebody's sanity


foc

Func-Oriented Code or Francis' Odd Collection.

foc is a no-frills, seamlessly integrated functional Python tool.

  • provides a collection of higher-order functions and placeholder lambda syntax (_)
  • provides an easy way to compose functions with symbols (^ and |)

The collection of utilities contained in previous versions has been separated into a new project.

Install

$ pip install -U foc

Use

For more examples, see the documentation provided with each function.

>>> from foc import *

>>> (_ + 7)(3)  # (lambda x: x + 7)(3)
10

>>> 3 | (_ + 4) | (_ * 6)  # (3 + 4) * 6
42

>>> (length ^ range)(10)  # length(range(10))
10

>>> cf_(rev, filter(even), range)(10)  # (rev ^ filter(even) ^ range)(10)
[8, 6, 4, 2, 0]

>>> ((_ * 5) ^ nth(3) ^ range)(5)  # 'nth' is 1-based: range(5)[2] * 5
10

>>> cf_(sum, map(_ + 1), range)(10)  # sum(map(_ + 1, range(10)))
55

>>> range(5) | map((_ * 3) ^ (_ + 2)) | sum  # sum(map(cf_(_*3, _+2), range(5)))
60

>>> range(73, 82) | map(chr) | unchars  # unchars(map(chr, range(73, 82)))
'IJKLMNOPQ'

What is fx?

fx (Function eXtension) is the backbone of foc and provides a new syntax for composing functions.
Technically, fx lifts an ordinary Python function into a monadic function in the fx monad.
Strictly speaking, fx is the lift function itself, but here the functions produced by fx are also called fx.
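
For instance, lifting a plain lambda with fx makes it composable on the spot (a minimal sketch; inc is a hypothetical name):

>>> inc = fx(lambda x: x + 1)  # lift a plain function into 'fx'
>>> (inc ^ inc)(40)            # now composable right-to-left
42
>>> 40 | inc | inc             # and values pipe through it
42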

1. fx is a composable function using symbols.

There are two ways to compose functions with symbols as shown in the previous section.

Symbol        Description                        Evaluation Order
^ (caret)     same as the mathematical dot (.)   right-to-left
| (pipeline)  in the Unix pipeline manner        left-to-right

If you don't like composing functions with symbols, use cf_.
In fact, it is the most reliable and safest way to compose any functions.
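
For example, the composite from the previous section can be written all three ways (a sketch; wrapping the builtin range in fx is an assumption, so that the pipeline starts from an fx):

>>> cf_(rev, filter(even), range)(10)     # explicit composer
[8, 6, 4, 2, 0]
>>> (rev ^ filter(even) ^ range)(10)      # right-to-left with '^'
[8, 6, 4, 2, 0]
>>> (fx(range) | filter(even) | rev)(10)  # left-to-right with '|'
[8, 6, 4, 2, 0]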

2. fx is really easy to make.

fx is just a function decorated by @fx.
Wrap any function in fx when you need function composition on the fly.

>>> [1, 2, 3] | sum | (lambda x: x * 7)    # error, lambda is not a 'fx'
TypeError: unsupported operand ...

>>> [1, 2, 3] | sum | fx(lambda x: x * 7)  # just wrap it in 'fx'.
42

>>> @fx
... def func(arg):    # apply to function definition or bind 'g = fx(func)'
...     ...           # 'func' is now 'composable' with symbols

Most of the functions provided by foc are fx functions.
If the one you need isn't, just wrap it in fx and use it.

3. fx is a curried function.

# map := map(function, iterable)
# currying 'map' -> map(function)(iterable)
>>> map(_ * 8)(rg(1,...)) | take(5)    # rg(1,...) == [1,2,3,..], 'infinite' sequence
[8, 16, 24, 32, 40]                    # rg(1,3,...) == [1,3,5,..]
                                       # rg(1,4,...,11) == [1,4,7,10]

# bimap := bimap(f, g, tuple)
# bimap(f, g) := first(f) ^ second(g)  # map over both the 'first' and 'second' arguments
>>> bimap(_ + 3)(_ * 7)((5, 7))
(8, 49)
>>> (5, 7) | bimap(_ + 3)(_ * 7)
(8, 49)

>>> filterl(_ == "f")("fun-on-functions")    # filterl == (filter | collect)
['f', 'f']
>>> foldl(op.sub)(10)(range(1, 5))
0

@fx
def args(a, b, c, d):
    return f"{a}-{b}-{c}-{d}"

>>> args(1)(2)(3)(4) == args(1,2)(3,4) == args(1,2,3)(4) == args(1)(2,3,4) == args(1,2,3,4)
True

fx(g) gives you the curried version of g, which is also an fx.
If you want a curried function that is not an fx, use curry(g).
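
A quick sketch of the difference (g here is a hypothetical three-argument function):

>>> def g(a, b, c):
...     return a + b + c
>>> fx(g)(1)(2)(3)     # curried, and composable with '^' and '|'
6
>>> curry(g)(1)(2)(3)  # curried, but stays a plain function
6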

4. Lambdas with _ are fx.

>>> [1, 2, 3] | sum | (_ * 7)    # Use '_' lambda instead.
42
>>> ((_ * 6) ^ (_ + 4))(3)       # (3 + 4) * 6
42
>>> 2 | (_ * 7) | (60 % _) | (_ // 3)   # (60 % (2 * 7)) // 3
1

Partial application driven by _ also works when accessing a dict, an object, or an iterable, and even when calling functions. How about using _(_) as a curried function caller?

Operator      Equivalent Function
_[_]          op.getitem
_[item]       op.itemgetter(item)
_._           getattr
_.attr        op.attrgetter(attr)
_(_)          apply
_(*a, **k)    lambda f: f(*a, **k)

# dict
>>> d = dict(one=1, two=2, three="three")
>>> (_[_])(d)("two")  # curry(lambda a, b: a[b])(d)("two")
2
>>> (_["one"])(d)  # (lambda x: x["one"])(d)
1
>>> cf_(_[2:4], _["three"])(d)  # d["three"][2:4]
're'

# iterable
>>> r = range(5)
>>> (_[_])(r)(3)  # curry(lambda a, b: a[b])(r)(3)
3
>>> (_[3])(r)  # (lambda x: x[3])(r)
3

# object
>>> o = type('', (), {"one": 1, "two": 2, "three": "three"})()
>>> (_._)(o)("two")  # curry(lambda a, b: getattr(a, b))(o)("two")
2
>>> (_.one)(o)  # (lambda x: x.one)(o)
1
>>> o | _.three | _[2:4]  # o.three[2:4]
're'

# function caller 
>>> _(_)(foldl)(op.add)(0)(range(5))
10
>>> _(7 * _)(mapl)(range(1, 10))
[7, 14, 21, 28, 35, 42, 49, 56, 63]

# Not seriously, this creates a multiplication table.
>>> [ mapl(f)(range(1, 10)) for f in _(_ * _)(map)(range(1, 10)) ]

Don't forget that foc is a collection, albeit a very odd one.

To see all the functions provided by foc, run catalog().

Everything in one place.

  • pure basic fx functions: id, const, seq, take, drop, repeat, replicate, ..
  • higher-order functions like f_, g_, curry, uncurry, flip, map, filter, zip, ..
  • function-composition tools like cf_, cfd_, ..
  • useful yet fundamental utilities like rg, force, trap, error, guard, .. (a couple are sketched below)
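
A couple of these in action (a sketch; it assumes repeat(7) yields 7 forever, as its Haskell namesake does):

>>> rg(1, 3, ...) | take(4)  # odd numbers, as in the 'rg' examples above
[1, 3, 5, 7]
>>> take(3)(repeat(7))       # 'repeat' assumed infinite, truncated by 'take'
[7, 7, 7]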

Real-World Example

The causal self-attention of a transformer model, based on PyTorch, can be written as follows.
Somebody insists that this helps to follow the process flow without distraction (plus, a 3-5% speed-up).

    def forward(self, x):
        B, S, E = x.size()  # size_batch, size_block (sequence length), size_embed
        N, H = self.config.num_heads, E // self.config.num_heads  # E == (N * H)

        q, k, v = self.c_attn(x).split(self.config.size_embed, dim=2)
        q = q.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        k = k.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        v = v.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)

        # Attention(Q, K, V)
        #   = softmax( Q*K^T / sqrt(d_k) ) * V
        #         // q*k^T: (B, N, S, H) x (B, N, H, S) -> (B, N, S, S)
        #   = attention-prob-matrix * V
        #         // prob @ v: (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        #   = attention-weighted value (attention score)

        return cf_(
            self.dropout,  # dropout of layer's output
            self.c_proj,  # linear projection
            g_(_.view)(B, S, E),  # (B, S, N, H) -> (B, S, E)
            torch.Tensor.contiguous,  # contiguous in-memory tensor
            g_(_.transpose)(1, 2),  # (B, S, N, H)
            _ @ v,  # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
            self.dropout_attn,  # attention dropout
            f_(F.softmax, dim=-1),  # softmax
            g_(_.masked_fill)(mask == 0, float("-inf")),  # no-look-ahead
            _ / math.sqrt(k.size(-1)),  # / sqrt(d_k)
            _ @ k.transpose(-2, -1),  # Q @ K^T -> (B, N, S, S)
        )(q)
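
For comparison, the cf_ chain above unrolls into the usual imperative tail (the same computation, step by step, under the same assumptions about q, k, v, and mask):

        att = q @ k.transpose(-2, -1)                     # Q @ K^T -> (B, N, S, S)
        att = att / math.sqrt(k.size(-1))                 # / sqrt(d_k)
        att = att.masked_fill(mask == 0, float("-inf"))   # no-look-ahead
        att = F.softmax(att, dim=-1)                      # attention probabilities
        att = self.dropout_attn(att)                      # attention dropout
        y = att @ v                                       # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        y = y.transpose(1, 2).contiguous().view(B, S, E)  # (B, N, S, H) -> (B, S, E)
        return self.dropout(self.c_proj(y))               # projection, then dropout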
