
Cloudia

Tools to easily create a word cloud.

from string

We can create a word cloud from a str or a List[str].

from cloudia import Cloudia

text1 = "text data..."
text2 = "text data..."

# from str
Cloudia(text1).plot()

# from list
Cloudia([text1, text2]).plot()

Example text from the 20 Newsgroups dataset.

[sample word cloud image]

We can also create word clouds from tuples.

from cloudia import Cloudia

text1 = "text data..."
text2 = "text data..."
Cloudia([ ("cloudia 1", text1), ("cloudia 2", text2) ]).plot()

Each tuple is ("IMAGE TITLE", "TEXT").

from pandas

We can use pandas.

import pandas as pd
from cloudia import Cloudia

df = pd.DataFrame({'wc1': ['sample1', 'sample2'], 'wc2': ['hoge hoge piyo piyo fuga', 'hoge']})

# plot from df
Cloudia(df).plot()

# or via the .wc accessor that cloudia adds to DataFrames
df.wc.plot(dark_theme=True)

Cloudia accepts a pandas.DataFrame or a pandas.Series.

[pandas word cloud image / dark theme image]
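As a minimal sketch (an assumption, not shown in the original examples), a plain Series should work like a single DataFrame column, with its name used as the title:

import pandas as pd
from cloudia import Cloudia

# hypothetical plain-Series example; the name 'wc1' is assumed to become the image title
s = pd.Series(['hoge hoge piyo piyo fuga', 'hoge'], name='wc1')
Cloudia(s).plot()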

We can use a tuple here too.

Cloudia( ("IMAGE TITLE", pd.Series(['hoge'])) ).plot()

from japanese

We can process Japanese too.

text = "これはCloudiaのテストです。WordCloudをつくるには本来、形態素解析の導入が必要になります。Cloudiaはmecabのような形態素解析器の導入は必要はなくnagisaを利用した動的な生成を行う事ができます。nagisaとjapanize-matplotlibは、形態素解析を必要としてきたWordCloud生成に対して、Cloudiaに対して大きく貢献しました。ここに感謝の意を述べたいと思います。"

Cloudia(text).plot()

A word cloud from Japanese text, without installing a morphological analysis module.

[Japanese word cloud image]

There is no need to install a morphological analyzer; Cloudia uses nagisa internally.

Install

pip install cloudia

Args

Cloudia args.

Cloudia(
  data,    # text data
  single_words=[],    # words that should not be split, e.g. ["neural network"]
  stop_words=STOPWORDS,    # words excluded from the count; default is wordcloud.STOPWORDS
  extract_postags=['名詞', '英単語', 'ローマ字文'],    # parts of speech extracted for Japanese (noun, English word, romanized text)
  parse_func=None,    # custom text-splitting function, e.g. lambda x: x.split(',')
  multiprocess=True,    # use multiprocessing while parsing
  individual=False    # flag for ' '.join(word) during parsing
)
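As a rough sketch (not taken from the original docs), the arguments above might be combined like this, keeping "neural network" as one token and splitting on commas:

from cloudia import Cloudia

text = "neural network data, deep learning data, neural network model"

# hypothetical combination of the documented arguments
Cloudia(
    text,
    single_words=["neural network"],      # keep this phrase as one token
    parse_func=lambda x: x.split(','),    # split on commas instead of the default parser
    multiprocess=False,                   # run in a single process
).plot()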

plot method args.

Cloudia().plot(
    dark_theme=False,    # use a dark color theme
    title_size=12,    # title text size
    row_num=3,    # grid layout; e.g. 12 word clouds with row_num=3 gives a 4x3 grid
    figsize_rate=2    # figure size scaling factor
)
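For example (a sketch reusing text1 and text2 from above), plotting two titled clouds on a dark theme:

Cloudia([("cloudia 1", text1), ("cloudia 2", text2)]).plot(
    dark_theme=True,    # dark background
    title_size=16,      # larger titles
    row_num=2           # see the row_num note above
)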

save method args.

Cloudia().save(
    file_path,    # path of the image file to save
    dark_theme=False,
    title_size=12,
    row_num=3,
    figsize_rate=2
)
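As a quick sketch, saving the pandas example above to a file instead of plotting it (the file name is just an example):

Cloudia(df).save('wordcloud.png', dark_theme=True)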

wc.plot method args for pandas.DataFrame and pandas.Series.

DataFrame.wc.plot(
  single_words=[],    # words that should not be split, e.g. ["neural network"]
  stop_words=STOPWORDS,    # words excluded from the count; default is wordcloud.STOPWORDS
  extract_postags=['名詞', '英単語', 'ローマ字文'],    # parts of speech extracted for Japanese (noun, English word, romanized text)
  parse_func=None,    # custom text-splitting function, e.g. lambda x: x.split(',')
  multiprocess=True,    # use multiprocessing while parsing
  individual=False,    # flag for ' '.join(word) during parsing
  dark_theme=False,    # use a dark color theme
  title_size=12,    # title text size
  row_num=3,    # grid layout; e.g. 12 word clouds with row_num=3 gives a 4x3 grid
  figsize_rate=2    # figure size scaling factor
)

wc.save takes the same arguments, plus a file_path argument.
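A sketch of the accessor version, assuming wc.save mirrors wc.plot with file_path added (the file name is just an example):

df.wc.save('wordcloud.png', dark_theme=True)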

Thanks

Cloudia's Japanese support builds on nagisa and japanize-matplotlib.

