Stream logs to a central service and tail them in your browser. Logflare is different because you can bring your own backend. Simply provide your BigQuery credentials and we stream logs into your BigQuery table while automatically managing the schema.
Sign up at https://logflare.app.
Automatically log structured request/response data in a few clicks with the Cloudflare app.
Set up the Logflare Vercel integration and we'll automatically structure your Vercel logs.
Use our Pino transport to log structured data and exceptions straight from your JavaScript project.
Use our Logger backend to send your Elixir exceptions and structured logs to Logflare.
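For example, once the backend is attached, plain `Logger` calls with metadata become structured Logflare events. A minimal sketch (the message and field names here are made up for illustration):

```elixir
require Logger

# Metadata keys become structured, searchable fields in Logflare.
Logger.info("user signed up", plan: "free", user_id: 123)

# Exceptions can be logged with their formatted message and stacktrace.
try do
  raise "boom"
rescue
  e -> Logger.error(Exception.format(:error, e, __STACKTRACE__))
end
```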
- Official website: https://logflare.app
- All our guides: https://logflare.app/guides
- Support: https://twitter.com/logflare_logs or [email protected]
We are leaving this repo public as an example of a larger Elixir project. We hope to have an open source edition of Logflare at some point in the future.
Logflare uses a SQL parser from sqlparser.com. To get set up on your dev machine:
- Install dependencies with `asdf` using `asdf install`
- Copy over secrets to two locations
  - Dev secrets - `configs/dev.secret.exs`
  - Google JWT key - `config/secrets/logflare-dev-238720-63d50e3c9cc8.json`
- Start the database with `docker-compose up -d`
- Run `mix setup` for deps, migrations, and seed data
- Restart your Postgres server for the replication settings to take effect: `docker-compose restart`
- Run `(cd assets; yarn)` from the project root to install JS dependencies
- Install `sqlparser` by following the steps in the Closed Source Usage section
- Start the server with `mix start`
- Sign in as a user
- Create a source
- Update `dev.secret.exs`: search for the `:logflare_logger_backend` config and update the user api key and source id (see the sketch after this list)
- The user api key can be retrieved from the dashboard or from the database `users` table; the source id is on the source page
- In the `iex` console, test that everything works:

```elixir
iex> LogflareLogger.info("testing log message")
```
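The `:logflare_logger_backend` block you update in `dev.secret.exs` might look roughly like this (a sketch with placeholder values; only the two keys called out above are shown, and other options in the real file may differ):

```elixir
# Sketch of the block in dev.secret.exs - placeholder values, replace with your own.
config :logflare_logger_backend,
  # User api key, from the dashboard or the database `users` table
  api_key: "YOUR_USER_API_KEY",
  # Source id, from the source page
  source_id: "YOUR_SOURCE_UUID"
```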
- Build images with `docker compose build`
- Run with `docker compose up -d`
Use the `:error_string` metadata key when logging. It's for additional information we want to log but don't necessarily want searchable or parsed for schema updates.
For example: `Logger.error("Some error", error_string: inspect(params))`
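A slightly fuller, hypothetical illustration of the convention, keeping a searchable metadata key next to the free-form `:error_string` dump:

```elixir
require Logger

params = %{"card" => "4242...", "amount" => "oops"}

# `reason` stays as searchable, schema-managed metadata, while the raw params
# are inspected into :error_string so they are logged without being parsed
# into the schema.
Logger.error("Payment charge failed",
  reason: :invalid_amount,
  error_string: inspect(params)
)
```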