
jsonlash

CLI utility for filtering and aggregating JSONL streams. No matter which logging service you use (LogDNA, Papertrail, Loggly, etc.), simply pipe your log into jsonlash, set up filters and aggregators, and see the aggregated data in real time.

Usage

Installation

Install from NPM globally:

npm install -g jsonlash

After installation you can run jsonlash from your terminal with the -h parameter to display the help page:

jsonlash -h

Basic usage with filtering

We currently use LogDNA as our logging service, so I am going to use it in the examples, but this works with any JSONL stream. Pipe your log stream into jsonlash:

logdna tail | jsonlash

For now it will simply print out the log as it comes. Let's filter the API logs, which have the following form:

{
    "msg": "API call",
    "req": {
        "duration": 590,
        "method": "GET",
        "route": "V2.datasets.items",
        ...
    },
    ...
}

Filtering is done using the -f [FILTER] parameter:

logdna tail | jsonlash -f 'msg=API call'

We can add more filters to keep only requests with the POST method and a duration over 1000 ms, and also add the -e parameter to expand the printed JSONs so they are more readable:

logdna tail | jsonlash -f 'msg=API call' -f 'req.method=POST' -f 'req.duration>1000' -e
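
Conceptually, each -f condition names a dot path into the parsed JSON line and a comparison against the given value. The following Python sketch illustrates what the three filters above express; it is only an illustration of the semantics, not how jsonlash itself is implemented:

import json
import sys

def get_path(obj, path):
    # Resolve a dot path such as 'req.duration' against a parsed JSON object.
    for key in path.split("."):
        if not isinstance(obj, dict) or key not in obj:
            return None
        obj = obj[key]
    return obj

def matches(record):
    # 'msg=API call', 'req.method=POST' and 'req.duration>1000' from above.
    duration = get_path(record, "req.duration")
    return (
        get_path(record, "msg") == "API call"
        and get_path(record, "req.method") == "POST"
        and isinstance(duration, (int, float))
        and duration > 1000
    )

for line in sys.stdin:
    try:
        record = json.loads(line)
    except json.JSONDecodeError:
        continue  # the -d flag makes jsonlash report such parsing errors
    if matches(record):
        print(json.dumps(record, indent=4))  # roughly what -e (expand) produces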

Aggregations

Let's continue with the API logs example. To group log lines by request method and compute the average and maximum duration, call:

logdna tail | jsonlash -f 'msg=API call' -a req.method --max req.duration --avg req.duration

and the output will be a table with the data aggregated in real time.
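
This kind of aggregation is essentially a group-by over the stream. A rough Python equivalent of the command above, as an illustration only (jsonlash keeps updating and re-rendering the table as new lines arrive):

import json
import sys
from collections import defaultdict

# Collect req.duration values per req.method for matching log lines.
durations = defaultdict(list)

for line in sys.stdin:
    try:
        record = json.loads(line)
    except json.JSONDecodeError:
        continue
    if record.get("msg") != "API call":
        continue
    req = record.get("req", {})
    if isinstance(req.get("duration"), (int, float)) and "method" in req:
        durations[req["method"]].append(req["duration"])

# One row per group with the --max and --avg columns.
for method, values in sorted(durations.items()):
    print(method, max(values), sum(values) / len(values))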

Examples

1.

Aggregate logs by two fields, req.method and req.routeName, and compute the average and maximum duration:

... | jsonlash -a req.method -a req.routeName --max req.duration --avg req.duration

2.

Filter requests taking more than 10 seconds, group them by req.routeName, and compute how many unique users requested each of them:

... | jsonlash -f 'req.duration>10000' -a req.routeName --uni req.userId
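
The --uni aggregator counts distinct values of a field within each group, so this example reports the number of different req.userId values per route. A conceptual Python equivalent, again only as an illustration of the semantics:

import json
import sys
from collections import defaultdict

# Distinct req.userId values per req.routeName for requests slower than 10 s.
users_per_route = defaultdict(set)

for line in sys.stdin:
    try:
        record = json.loads(line)
    except json.JSONDecodeError:
        continue
    req = record.get("req", {})
    if isinstance(req.get("duration"), (int, float)) and req["duration"] > 10000:
        users_per_route[req.get("routeName")].add(req.get("userId"))

for route, users in users_per_route.items():
    print(route, len(users))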

Command reference

This is a simple command-line tool to filter and aggregate JSONL (JSON Lines) streams.

USAGE
  $ jsonlash

OPTIONS
  -a, --aggregate=[FIELD]    aggregate JSONL items
  -d, --debug                debug mode, shows JSON parsing errors
  -e, --expand               expand outputted JSON
  -f, --filter=[CONDITION]   filter JSONL items
  -h, --help                 show CLI help

  -v, --version              show CLI version
  --avg=avg                  aggregate average value over all occurrences of given field
  --max=max                  aggregate maximum value over all occurrences of given field
  --min=min                  aggregate minimum value over all occurrences of given field
  --sum=sum                  aggregate sum over all occurrences of given field
  --uni=uni                  aggregate number of unique occurrences of given field

DESCRIPTION
Simply pipe in any JSONL stream and use the filter and/or aggregation flags.

If you use only the --filter flag then jsonlash outputs the filtered JSONL stream.

If you also use the --aggregate flag then it renders a table with aggregated data.
Additionally you may add one or more --min|--max|--sum|--avg|--uni flags to
compute aggregated values of the given fields.
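
If you want to try the flags without a real logging service, any command producing JSONL will do. For example, the following small generator script emits fake "API call" lines matching the examples above (the script name and the generated values are made up for illustration):

# generate_sample.py -- emit fake "API call" log lines as JSONL
import json
import random

for _ in range(1000):
    print(json.dumps({
        "msg": "API call",
        "req": {
            "method": random.choice(["GET", "POST", "PUT"]),
            "routeName": random.choice(["V2.datasets.items", "V2.users.get"]),
            "duration": random.randint(10, 2000),
            "userId": "user-{}".format(random.randint(1, 50)),
        },
    }))

Pipe it into jsonlash with the documented flags, for instance:

python generate_sample.py | jsonlash -a req.method --avg req.duration --max req.duration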
