A collection of Python functions for somebody's sanity

foc

Func-Oriented Code or Francis' Odd Collection.

foc is a no-frills, seamlessly integrated functional tool for Python.

  • provides a collection of higher-order functions and placeholder lambda syntax (_)
  • provides an easy way to compose functions with symbols (^ and |)

The collection of utilities contained in previous versions has been separated into a new project.

Install

$ pip install -U foc

Use

For more examples, see the documentation provided with each function.

>>> from foc import *

>>> (_ + 7)(3)  # (lambda x: x + 7)(3)
10

>>> 3 | _ + 4 | _ * 6  # (3 + 4) * 6
42

>>> (length ^ range)(10)  # length(range(10))
10

>>> cf_(rev, filter(even), range)(10)  # rev(filter(even)(range(10)))
[8, 6, 4, 2, 0]

>>> ((_ * 5) ^ nth(3) ^ range)(5)  # nth(3)(range(5)) * 5; nth is 1-indexed, so this is 2 * 5
10

>>> cf_(sum, map(_ + 1), range)(10)  # sum(map(_ + 1, range(10)))
55

>>> range(5) | map((_ * 3) ^ (_ + 2)) | sum  # sum(map(lambda x: (x + 2) * 3, range(5)))
60

>>> range(73, 82) | map(chr) | unchars  # unchars(map(chr, range(73, 82)))
'IJKLMNOPQ'

What is fx?

fx (Function eXtension) is the backbone of foc and provides a new syntax for composing functions.
Technically, fx maps every Python function to a monadic function in the fx monad.
Strictly speaking, fx is a lift function, but here the functions generated by fx are also called fx.
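
For instance (a minimal sketch, assuming nothing beyond fx itself), lifting a plain lambda with fx lets it take part in symbol composition:

>>> inc = fx(lambda x: x + 1)  # lift a plain lambda into 'fx'
>>> 41 | inc                   # now it composes with '|'
42
>>> (inc ^ (_ * 2))(20)        # and with '^': inc(20 * 2)
41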

1. fx is a composable function using symbols.

There are two ways to compose functions with symbols as shown in the previous section.

Symbol        Description                               Evaluation Order
^ (caret)     same as the mathematical dot (.) symbol   Right-to-Left
| (pipeline)  in the Unix pipeline manner               Left-to-Right

If you don't like composing functions with symbols, use cf_.
In fact, it is the most reliable and safest way to compose any functions.
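
As a quick sketch, here is an earlier composition rewritten with cf_, which, like ^, applies right-to-left:

>>> cf_(length, range)(10)  # same as (length ^ range)(10)
10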

2. fx is really easy to make.

fx is just a function decorated with @fx.
Wrap any function in fx whenever you need function composition on the fly.

>>> [1, 2, 3] | sum | (lambda x: x * 7)    # error, lambda is not a 'fx'
TypeError: unsupported operand ...

>>> [1, 2, 3] | sum | fx(lambda x: x * 7)  # just wrap it in 'fx'.
42

>>> @fx
... def func(arg):    # place @fx above the definition or bind 'g = fx(func)'
...     ...           # 'func' is now 'composable' with symbols

Most of the functions provided by foc are fx functions.
If the one you need isn't there, just create one and use it.
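
As a sketch, a hypothetical double decorated with @fx slots straight into a pipeline:

>>> @fx
... def double(x):
...     return x * 2
>>> [1, 2, 3] | sum | double  # double(sum([1, 2, 3]))
12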

3. fx is a curried function.

# map := map(predicate, iterable)
# currying 'map' -> map(predicate)(iterable)
>>> map(_ * 8)(seq(1,...)) | take(5)   # seq(1,...) == [1,2,3,..], 'infinite' sequence
[8, 16, 24, 32, 40]                    # seq(1,3,...) == [1,3,5,..]
                                       # seq(1,4,...,11) == [1,4,7,10]

# bimap := bimap(f, g, tuple)
# bimap(f, g) := first(f) ^ second(g)  # map over both the 'first' and 'second' elements
>>> bimap(_ + 3)(_ * 7)((5, 7))
(8, 49)
>>> (5, 7) | bimap(_ + 3)(_ * 7)
(8, 49)

>>> filterl(_ == "f")("fun-on-functions")    # filterl == (filter | collect)
['f', 'f']
>>> foldl(op.sub)(10)(range(1, 5))
0

@fx
def args(a, b, c, d):
    return f"{a}-{b}-{c}-{d}"

>>> args(1)(2)(3)(4) == args(1,2)(3,4) == args(1,2,3)(4) == args(1)(2,3,4) == args(1,2,3,4)
True

fx(g) gives you the curried version of g.
But if you want a curried function that is not an fx, use curry(g).
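
For instance (a sketch using a hypothetical add3), curry(g) gives you plain currying without the fx symbol machinery:

>>> def add3(a, b, c):
...     return a + b + c
>>> curry(add3)(1)(2)(3)
6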

4. Lambdas with _ are fx.

>>> [1, 2, 3] | sum | (_ * 7)    # Use '_' lambda instead.
42
>>> ((_ * 6) ^ (_ + 4))(3)       # (3 + 4) * 6
42
>>> 2 | (_ * 7) | (60 % _) | (_ // 3)   # (60 % (2 * 7)) // 3
1

Partial application driven by _ is also possible when accessing dicts, objects, or iterables, and even when calling functions. How about using _(_) as a curried function caller?

Operator     Equivalent function
_[_]         op.getitem
_[item]      op.itemgetter(item)
_._          getattr
_.attr       op.attrgetter(attr)
_(_)         apply
_(*a, **k)   lambda f: f(*a, **k)

# dict
>>> d = dict(one=1, two=2, three="three")
>>> _[_](d)("two")  # curry(lambda a, b: a[b])(d)("two")
2
>>> _["one"](d)  # (lambda x: x["one"])(d)
1
>>> cf_(_[2:4], _["three"])(d)  # d["three"][2:4]
're'

# iterable
>>> r = range(5)
>>> _[_](r)(3)  # curry(lambda a, b: a[b])(r)(3)
3
>>> _[3](r)     # (lambda x: x[3])(r)
3

# object
>>> o = type('', (), {"one": 1, "two": 2, "three": "three"})()
>>> _._(o)("two")  # curry(lambda a, b: getattr(a, b))(o)("two")
2
>>> _.one(o)  # (lambda x: x.one)(o)
1
>>> o | _.three | _[2:4]  # o.three[2:4]
're'

# function caller 
>>> _(_)(foldl)(op.add)(0)(range(5))
10
>>> _(7 * _)(mapl)(range(1, 10))
[7, 14, 21, 28, 35, 42, 49, 56, 63]

# Not seriously, but this creates a multiplication table.
>>> [ mapl(f)(range(1, 10)) for f in _(_ * _)(map)(range(1, 10)) ]
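# -> [[1, 2, .., 9], [2, 4, .., 18], .., [9, 18, .., 81]] (sketched output)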

Don't forget that foc is a collection, albeit a very odd one.

To see all the functions provided by foc, run catalog().

Everything in one place.

  • pure basic fx functions: id, const, take, drop, repeat, replicate, ..
  • higher-order functions like f_, g_, curry, uncurry, flip, map, filter, zip, ..
  • function composition tools like cf_, cfd_, ..
  • useful yet fundamental utilities like seq, force, trap, error, guard, ..
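
A small sketch combining a few of these (assuming drop pipes the same way take does above):

>>> seq(1,...) | drop(3) | take(4)  # drop the first 3, then take the next 4
[4, 5, 6, 7]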

Real-World Example

The causal self-attention of a transformer model, based on PyTorch, can be described as follows.
Somebody insists that this helps to follow the process flow without distraction (plus, a 3-5% speed-up).

    def forward(self, x):
        B, S, E = x.size()  # size_batch, size_block (sequence length), size_embed
        N, H = self.config.num_heads, E // self.config.num_heads  # E == (N * H)

        q, k, v = self.c_attn(x).split(self.config.size_embed, dim=2)
        q = q.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        k = k.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        v = v.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)

        # Attention(Q, K, V)
        #   = softmax( Q*K^T / sqrt(d_k) ) * V
        #         // q*k^T: (B, N, S, H) x (B, N, H, S) -> (B, N, S, S)
        #   = attention-prob-matrix * V
        #         // prob @ v: (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        #   = attention-weighted value (attention score)

        return cf_(
            self.dropout,  # dropout of layer's output
            self.c_proj,  # linear projection
            g_(_.view)(B, S, E),  # (B, S, N, H) -> (B, S, E)
            torch.Tensor.contiguous,  # contiguous in-memory tensor
            g_(_.transpose)(1, 2),  # (B, S, N, H)
            _ @ v,  # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
            self.dropout_attn,  # attention dropout
            f_(F.softmax, dim=-1),  # softmax
            g_(_.masked_fill)(mask == 0, float("-inf")),  # no-look-ahead; 'mask' is the causal mask, defined elsewhere
            _ / math.sqrt(k.size(-1)),  # / sqrt(d_k)
            _ @ k.transpose(-2, -1),  # Q @ K^T -> (B, N, S, S)
        )(q)
