JSON Lines

On the web

dat uses JSON Lines (newline-delimited JSON) in its streaming APIs

NDJ is a similar format that also allows C++-style comments and blank lines

Bubbles supports JSON Lines datastores

Logstash supports JSON Lines via the json_lines codec

ndjson is a similar format that also allows blank lines
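For readers unfamiliar with the format itself: each line holds one complete JSON value. A minimal Python sketch of a tolerant reader (the blank-line handling mirrors what ndjson permits; the function name and sample data are our own):

```python
import io
import json

def iter_jsonlines(stream):
    """Yield one parsed JSON value per non-blank line.

    Skipping blank lines is a lenience borrowed from the ndjson
    variant; a strict JSON Lines reader may reject them.
    """
    for line in stream:
        line = line.strip()
        if not line:  # tolerate blank lines, as ndjson allows
            continue
        yield json.loads(line)

sample = '{"name": "alice"}\n\n{"name": "bob"}\n'
records = list(iter_jsonlines(io.StringIO(sample)))
```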

plot.ly uses JSON Lines for its streaming data API

Graylog GELF is a format for log messages; its stream output is de facto JSON Lines.

Scrapy is a framework for web scraping and crawling; it has supported and recommended JSON Lines for a long time, and may even have coined the term.

ClickHouse is an open-source column-oriented DBMS. It supports JSON Lines as the JSONEachRow format for both input and output.
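ClickHouse's JSONEachRow format is plain JSON Lines: one row object per line. A sketch of building such a payload in Python (the table and column names are made up; actually sending it, e.g. over ClickHouse's HTTP interface, is left out):

```python
import json

rows = [
    {"ts": "2024-01-01 00:00:00", "level": "info", "msg": "started"},
    {"ts": "2024-01-01 00:00:05", "level": "warn", "msg": "slow query"},
]

# One JSON object per line -- exactly the JSONEachRow wire format.
payload = "\n".join(json.dumps(row) for row in rows) + "\n"

# The payload would accompany a query such as:
# INSERT INTO logs FORMAT JSONEachRow
```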

Dataflow kit is an open-source web scraping framework written in Go. JSON Lines is one of its formats for storing results.

dart uses JSON Lines as one of its possible reporter formats when running tests.

Apache Spark uses JSONL for reading and writing JSON data.
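One reason JSON Lines suits Spark: because every record ends at a newline, a large file can be split at line boundaries and each partition parsed independently. A plain-Python sketch of that property (pyspark itself is omitted; in Spark, reading such a file is typically just `spark.read.json(path)`):

```python
import json

lines = [
    '{"id": 1}',
    '{"id": 2}',
    '{"id": 3}',
    '{"id": 4}',
]

# Split the "file" at a line boundary and parse each chunk
# independently -- no chunk needs to see any other chunk.
chunks = [lines[:2], lines[2:]]
parsed = [json.loads(line) for chunk in chunks for line in chunk]
```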

ArangoDB is an open-source multi-model database. The JSON Lines format allows importing huge numbers of documents sequentially (via arangoimport).

Rumble is a JSONiq engine that runs on top of Spark. It can process datasets in the JSON Lines format containing billions of objects and more.

Neo4j, the open-source graph database, supports JSONL export and import via its standard library procedures apoc.export/import.json, allowing stream processing of nodes and relationships.

petl is a general-purpose Python package for extracting, transforming, and loading tables of data. It allows importing and exporting documents/records between many databases and file formats, including JSON Lines, across local and remote filesystems and clouds.

BigQuery uses JSON Lines as one of the supported formats to load data into the database.

Airbyte is an open-source data integration tool that uses JSON Lines to communicate between containerized source applications that pull data from files/APIs/databases and containerized destination applications that write data to warehouses.
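The pattern Airbyte relies on is simple: the source writes one JSON object per line to stdout, and the destination reads stdin line by line. A toy sketch of both halves wired together in one process (the record shapes and function names are illustrative, not Airbyte's actual protocol messages):

```python
import io
import json

def source(out):
    # Source side: emit each record as one JSON line.
    for record in [{"id": 1, "email": "a@example.com"},
                   {"id": 2, "email": "b@example.com"}]:
        out.write(json.dumps(record) + "\n")

def destination(inp):
    # Destination side: consume records line by line.
    return [json.loads(line) for line in inp if line.strip()]

# In Airbyte the two sides run in separate containers joined by a
# pipe; a StringIO stands in for that pipe here.
pipe = io.StringIO()
source(pipe)
pipe.seek(0)
received = destination(pipe)
```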