Logger comes packaged with the language. It is free of charge and packed with outstanding features. It’s a sin against the basic principles of Alchemy not to use it in your projects to gain more visibility into what’s going on in your system.

While you can - and probably should - add more Logger.info/warn/error statements to your own functions, there is a simple trick that increases your chances of figuring out what’s going on in the system. All you have to do is set more metadata on your existing log entries.
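To see what that means in practice, here is a minimal sketch you can paste into iex or an .exs script. Note that Logger.metadata/1 merges the given keywords into the metadata of the calling process, and that the console backend only prints keys whitelisted in its :metadata configuration:

```elixir
require Logger

# Attach metadata to the current process; every subsequent
# log entry emitted from this process will carry it.
Logger.metadata(user_id: 42)
Logger.info("loading dashboard")

# Metadata is per-process: a freshly spawned process has none.
task = Task.async(fn -> Logger.metadata() end)
IO.inspect(Task.await(task))                          # []

# ...while the current process still has what we set:
IO.inspect(Keyword.get(Logger.metadata(), :user_id))  # 42
```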

Logging in a Phoenix project

I assume you use the Phoenix framework to build your web application / API / GraphQL endpoint. If you do, Logger is likely pre-configured and already extended with some metadata.

Spawn a Phoenix project, make a few requests and look at what it outputs to the console. It should be something like:

...
12:38:20.770 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] GET /people
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] QUERY OK source="users" db=0.7ms queue=0.1ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] Processing by MyApp.AppController.show/2
  Parameters: %{"a" => ["people"]}
  Pipelines: [:browser]
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] Sent 200 in 1ms
12:38:21.188 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [info] POST /api/query
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] QUERY OK source="users" db=0.4ms queue=0.1ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] Processing by MyApp.QueryController.execute/2
  Parameters: %{"name" => "ListPeople", "payload" => %{}}
  Pipelines: [:api]
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] QUERY OK source="people" db=0.1ms queue=0.1ms
SELECT p0."id", p0."first_name", p0."last_name", p0."social_security", p0."dob", p0."relationship", p0."details", p0."client_id", p0."created_client_id", p0."inserted_at", p0."updated_at" FROM "people" AS p0 ORDER BY ARRAY[p0."first_name",p0."last_name"] []
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [info] Sent 200 in 1ms
...

Let’s deconstruct what we see above. Plug.Logger is responsible for the lines that read GET /people and Sent 200 in 1ms.

Ecto adds the QUERY OK source="users" db=0.7ms queue=0.1ms line along with the detailed SQL query that was executed.

Phoenix.Logger adds the Processing ... statement.

We can see that the libraries we use already provide good visibility into what is going on in the system: we can figure out which path was called, which SQL queries were executed (query logging is likely disabled in production, however), which controller was called and the parameters it received. Sweet!

We can also see the timestamps (i.e. 12:38:20.772), the log level ([info] or [debug]) and the single piece of metadata Phoenix sets and prints for us by default: request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k.

The request_id metadata is added by Plug.RequestId, which Phoenix set up for you in the endpoint module that was written, along with other files, when you initially generated the project.

Phoenix then configures Logger not to ignore this piece of metadata, and to output it to the console. You will find something like this in your config/config.exs:

# config/config.exs
# Configures Elixir's Logger
config :logger, :console,
  format: "$time $metadata[$level] $message\n",
  metadata: [:request_id]

If you remove the :request_id from the configuration above, and restart the project, the logs will no longer contain this additional piece of information.

Extending log statements with more metadata

Logger’s metadata is attached to the current process, and is present in all log entries that follow the call that set it. Our request_id is set once, in Plug.RequestId, and is present in all subsequent log entries made by Phoenix, Ecto or your own code. This comes in exceptionally handy when log entries from concurrent requests get interleaved in your log files / logger backend, like this:

...
12:38:20.770 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] GET /people
12:38:20.772 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [info] POST /api/query
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] Processing by MyApp.AppController.show/2
  Parameters: %{"a" => ["people"]}
  Pipelines: [:browser]
12:38:21.188 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] QUERY OK source="users" db=0.7ms queue=0.1ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
12:38:21.192 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] Sent 200 in 1ms
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] QUERY OK source="users" db=0.4ms queue=0.1ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] Processing by MyApp.QueryController.execute/2
  Parameters: %{"name" => "ListPeople", "payload" => %{}}
  Pipelines: [:api]
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [debug] QUERY OK source="people" db=0.1ms queue=0.1ms
SELECT p0."id", p0."first_name", p0."last_name", p0."social_security", p0."dob", p0."relationship", p0."details", p0."client_id", p0."created_client_id", p0."inserted_at", p0."updated_at" FROM "people" AS p0 ORDER BY ARRAY[p0."first_name",p0."last_name"] []
12:38:21.189 request_id=t8mee407rmc6gorpu4h4f3bltjr5das7 [info] Sent 200 in 1ms
...

To separate those, one could search for request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k and filter the output down to entries related to the single HTTP request that is of interest for debugging:

12:38:20.770 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] GET /people
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] QUERY OK source="users" db=0.7ms queue=0.1ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [debug] Processing by MyApp.AppController.show/2
  Parameters: %{"a" => ["people"]}
  Pipelines: [:browser]
12:38:20.772 request_id=2hq9kidh7f862ub7hvlt3hf082t2i46k [info] Sent 200 in 1ms

This is useful, yet I more often find myself in need of searching for all requests made by a given user - to reconstruct the so-called “user journey”, i.e. the steps the user took (or attempted to take) on the platform.

In the application’s source code, we likely already have a custom Plug module that finds the currently signed-in user based on a cookie, and puts it into conn.assigns.current_user. Let’s say it looks like this:

# plugs/auth.ex
defmodule MyApp.Plugs.Auth do
  import Plug.Conn

  def init(_opts), do: :ok

  def call(conn, :ok) do
    conn = fetch_cookies(conn)

    auth_token = conn.req_cookies["auth_token"]

    case Phoenix.Token.verify(MyApp.Endpoint, "user_id", auth_token) do
      {:ok, user_id} ->
        user = MyApp.Repo.get(MyApp.User, user_id)

        # user may be nil if the account was deleted, so guard the comparison
        case user && user.auth_token == auth_token do
          true ->
            assign(conn, :current_user, user)

          _ ->
            assign(conn, :current_user, nil)
        end

      _ ->
        assign(conn, :current_user, nil)
    end
  end
end

We can extend our logs by also setting the user_id metadata for Logger (remember to require Logger at the top of the module):

...
case user.auth_token == auth_token do
  true ->
    Logger.metadata(user_id: user.id)
    assign(conn, :current_user, user)

  _ ->
    assign(conn, :current_user, nil)
end
...

and also configure Logger to include that information when printing to the console:

# Configures Elixir's Logger
config :logger, :console,
  format: "$time $metadata[$level] $message\n",
  metadata: [:request_id, :user_id]

What we can see now in the log statements is something more useful:

13:11:36.139 request_id=qu7sv31plrb5svf68clq8eo9511g89gn user_id=1 [debug] QUERY OK source="clients" db=0.8ms queue=0.5ms
SELECT c0."id", c0."first_name", c0."last_name", c0."social_security", c0."dob", c0."tel", c0."email", c0."postcode", c0."no_share", c0."archive_details", c0."inserted_at", c0."updated_at" FROM "clients" AS c0 ORDER BY ARRAY[c0."first_name",c0."last_name"] []
13:11:36.140 request_id=qu7sv31plrb5svf68clq8eo9511g89gn user_id=1 [info] Sent 200 in 3ms

Note that user_id only starts appearing in the log statements once our MyApp.Plugs.Auth is called, and we have to associate it with the request path by matching on the request_id metadata. This tends to get tedious, so I like to de-normalize and add the HTTP method and path to the metadata too:

defmodule MyApp.MetadataLogger.Plug do
  require Logger
  @behaviour Plug

  def init(_opts), do: :ok

  def call(conn, :ok) do
    # Tag all subsequent log entries in this process with the method and path
    Logger.metadata(http_method: conn.method, http_path: conn.request_path)

    conn
  end
end

in the Endpoint, before Plug.Logger so the metadata is set before the first log entry:

plug(Plug.RequestId)
plug(MyApp.MetadataLogger.Plug)
plug(Plug.Logger)

in config.exs:

config :logger, :console,
  format: "$time $metadata[$level] $message\n",
  metadata: [:http_method, :http_path, :request_id, :user_id]

And our log statements look like this now:

13:34:49.767 http_method=GET http_path=/people request_id=9h187ro7556tu69hrhujobmpb58be72b [info] GET /people
13:34:49.769 http_method=GET http_path=/people request_id=9h187ro7556tu69hrhujobmpb58be72b [debug] QUERY OK source="users" db=1.2ms queue=0.3ms
SELECT u0."id", u0."first_name", u0."last_name", u0."email", u0."password_hash", u0."role", u0."archived", u0."password_reset_token", u0."telephone", u0."misc", u0."passcode", u0."auth_token", u0."login_attempts", u0."permitted_reports", u0."inserted_at", u0."updated_at" FROM "users" AS u0 WHERE (u0."id" = $1) [1]
13:34:49.769 http_method=GET http_path=/people request_id=9h187ro7556tu69hrhujobmpb58be72b user_id=1 [debug] Processing by MyApp.AppController.show/2
  Parameters: %{"a" => ["people"]}
  Pipelines: [:browser]
13:34:49.769 http_method=GET http_path=/people request_id=9h187ro7556tu69hrhujobmpb58be72b user_id=1 [info] Sent 200 in 2ms

This allows for easily searching for log statements related to a particular user, request, path etc.

You can use the same approach to add more useful metadata entries; one that comes to mind, and that I use often, is tenant_id or company_id, i.e. identifying the organization that uses a given multi-tenant application.
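For example, assuming the tenant is resolved in your auth plug and your user schema carries a tenant_id field (hypothetical names), it is the same two-step dance as before:

```elixir
# In your auth plug, right after the user is loaded (hypothetical field):
Logger.metadata(user_id: user.id, tenant_id: user.tenant_id)

# config/config.exs - tell the console backend to print the new key:
config :logger, :console,
  format: "$time $metadata[$level] $message\n",
  metadata: [:http_method, :http_path, :request_id, :user_id, :tenant_id]
```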

Note that adding more metadata will grow your log files significantly. You need to find the balance that is useful for you and your application / business purposes. Debugging with relevant metadata is a breeze, however.

Other uses

Since Logger’s metadata is set per-process, it begs to be used whenever you spawn dedicated processes such as GenServers. Often, a GenServer process can be identified by an ID, name or other attribute that is helpful when debugging issues. You can use Logger.metadata/1 to set such identifying metadata inside the init/1 callback, so that all log entries emitted by the given process are tagged accordingly.
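A minimal sketch of the idea - the module and the worker_id key are my inventions, and you would need to add :worker_id to the console backend’s :metadata list to see it printed:

```elixir
defmodule ReportWorker do
  # Hypothetical GenServer that tags all of its own log output
  # with the id it was started with.
  use GenServer
  require Logger

  def start_link(id), do: GenServer.start_link(__MODULE__, id)

  @impl true
  def init(id) do
    # From now on, every log entry emitted by this process
    # carries worker_id=<id>.
    Logger.metadata(worker_id: id)
    Logger.info("worker started")
    {:ok, id}
  end
end
```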

Logger’s metadata can also be used to implement audit-trail-like functionality, where you log user_id (we did this already!) and some resource_id, along with log statements that describe what changes were made. Paired with a document store and a Logger backend that saves data to it, this can be a relatively easy and universal way of building an audit trail / auditing framework for your application.
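Sketching the idea - the resource_id key and the message are made up for illustration, and the persistence side would live in a custom backend:

```elixir
require Logger

# Tag the process with who acted on what...
Logger.metadata(user_id: 1, resource_id: 42)

# ...and describe the change. A backend configured to persist
# [:user_id, :resource_id] alongside the message turns these
# lines into queryable audit records.
Logger.info("archived client record")
```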

The metadata is also used by third-party tools and libraries, such as Spandex, to provide additional debugging and tracing abilities.

Post by Hubert Łępicki

Hubert is partner at AmberBit. Rails, Elixir and functional programming are his areas of expertise.