# Frequency Counts of Unique Items in a Series

In this thread, you’ll learn different methods of counting the frequency (the number of occurrences) of each unique item in a series. There are several ways to achieve this; let’s go over a few of them:

#### 1. Using a Python dictionary:

• This technique loops through every item in the series, using a dictionary to accumulate the counts and the `get(item, default)` method to update them.
• The `get(item, default)` method retrieves the value associated with a key in the dictionary: `item` is the key whose value you want, and `default` is the value returned if the key is not found.
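A minimal sketch of the dictionary approach, using a small hypothetical sample list in place of a real series:

```python
# Hypothetical sample data; substitute your own series or list
data = ["a", "b", "a", "c", "b", "a"]

counts = {}
for item in data:
    # get() returns the running count for this item, or 0 if unseen so far
    counts[item] = counts.get(item, 0) + 1

print(counts)  # {'a': 3, 'b': 2, 'c': 1}
```

This pattern works on any iterable and needs no third-party libraries.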

#### 2. Using "value_counts()" Pandas method:

• The `value_counts()` method returns a series containing counts of unique values in a sequence of values.
• With `sort=True` (the default), the results are sorted by count in descending order.
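A short sketch with hypothetical sample data, assuming `pandas` is installed:

```python
import pandas as pd

# Hypothetical sample series
data = pd.Series(["a", "b", "a", "c", "b", "a"])

# sort=True (the default) orders the result by count, descending
freq = data.value_counts(sort=True)
print(freq)
```

The result is itself a Series: the unique values become the index and the counts become the values, so `freq["a"]` looks up a single count.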

#### 3. Using "np.unique()" method:

• The `np.unique()` function returns the sorted unique values from a sequence of values.
• When `return_counts` is set to `True` in the `np.unique()` function, it not only returns the unique values in an array or list but also the count of each unique value.

Note: `np.unique()` returns separate arrays for the unique items and their counts. Therefore, we pair each item with its count using the `zip()` function.
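A minimal sketch with hypothetical sample data, assuming NumPy is installed:

```python
import numpy as np

# Hypothetical sample data
data = ["a", "b", "a", "c", "b", "a"]

# return_counts=True also returns a parallel array of per-value counts
values, counts = np.unique(data, return_counts=True)

# zip() pairs each unique value with its count; tolist() converts
# the NumPy scalars to plain Python types for a clean dictionary
freq = dict(zip(values.tolist(), counts.tolist()))
print(freq)  # {'a': 3, 'b': 2, 'c': 1}
```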

#### 4. Using Counter class:

• The `Counter` class is located in the `collections` module of Python’s standard library.
• `Counter` counts the frequency of elements in a list or any other iterable: simply pass the series to `Counter` and it returns the frequency counts.
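A minimal sketch with hypothetical sample data:

```python
from collections import Counter

# Hypothetical sample data
data = ["a", "b", "a", "c", "b", "a"]

freq = Counter(data)
print(freq)                  # Counter({'a': 3, 'b': 2, 'c': 1})
print(freq.most_common(2))   # [('a', 3), ('b', 2)]
```

`Counter` behaves like a dictionary, and `most_common(n)` is a convenient extra for retrieving the top-n items.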

#### 5. Using "groupby()" method:

• The `groupby()` method in `pandas` is used to group the rows of a DataFrame, or the values of a Series, based on one or more columns, index levels, or keys.
• After grouping, we can apply various statistical functions or other operations on each group to aggregate or transform the data.
• For frequency counts, we can group a Series by its own values and apply the `count()` aggregate function to each group.
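A minimal sketch of this approach with hypothetical sample data, assuming `pandas` is installed:

```python
import pandas as pd

# Hypothetical sample series
data = pd.Series(["a", "b", "a", "c", "b", "a"])

# Group the Series by its own values, then count the members of each group
freq = data.groupby(data).count()
print(freq)
```

Like `value_counts()`, this produces a Series indexed by the unique values, though here the result is ordered by group key rather than by count.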