May 2021 is an exciting time in the Erlang ecosystem!
- May 12 saw the release of Erlang/OTP 24
- BeamAsm JIT compiler
- better error messages
- Elixir 1.12 is imminent
- scripting improvements
- OTP 24 integration
- stepped ranges
- additional functions in the standard library
- Livebook
- Axon
This Livebook and associated project files can be found at: https://github.com/rellen/one-twelve-twenty-four
- Interactive code notebook for Elixir built on Phoenix LiveView, by the Elixir Nx team
- https://github.com/elixir-nx/livebook
:erlang.system_info(:otp_release)
BeamAsm provides load-time conversion of Erlang BEAM instructions into native code on x86-64. This allows the loader to eliminate any instruction dispatching overhead and also specialize each instruction on their argument types.
https://erlang.org/doc/apps/erts/BeamAsm.html
https://blog.erlang.org/a-first-look-at-the-jit/
- https://asmjit.com/
- C++ JIT machine code generator
- chosen because it is simpler and faster than alternatives such as LLVM, at the cost of some further optimisations
:erlang.system_info(:emu_flavor)
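To check which flavor this notebook is running, we can branch on the emulator flavor reported above (a small sketch):

# report whether this BEAM was started with the JIT (:jit) or the classic interpreter (:emu)
case :erlang.system_info(:emu_flavor) do
  :jit -> IO.puts("BeamAsm JIT is active")
  other -> IO.puts("Not running the JIT (flavor: #{inspect(other)})")
end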
- minimal code memory increase - important for cache locality, etc
- anecdotally (from Robert Virding) on the order of 10% bigger
- wildly differing speed-ups quoted based on use case
- collating, reviewing, and making new benchmarks is a talk in itself
list = Enum.to_list(1..10_000)
map_fun = fn i -> [i, i * i + 3 * i / (2 + i)] end

Benchee.run(
  %{
    "map.flatten" => fn -> list |> Enum.map(map_fun) |> List.flatten() end
  },
  # save: [path: "save.benchee", tag: "12_24"]
  load: "save.benchee",
  time: 15,
  warmup: 5
)
Running this benchmark on Elixir 1.11 w/ OTP 23 and comparing to 1.12 w/ OTP 24 shows a ~7% speed-up
Comparison:
map.flatten                  1.56 K
map.flatten (11_23)          1.45 K - 1.07x slower +45.29 μs
Extended error information for failing BIF calls as proposed in EEP 54
iex(1)> :ets.insert(:foo, {:bar, :baz})
** (ArgumentError) argument error
(stdlib 3.14.2) :ets.insert(:foo, {:bar, :baz})
iex(1)>
# on OTP 24, the error now explains that the 1st argument does not refer to an existing ETS table
:ets.insert(:foo, {:bar, :baz})
ref = :ets.new(:foo, [])
# here the extended error points at the 2nd argument (not a tuple)
:ets.insert(ref, :bar)
Check out the many, many wonderful changes in OTP 24 at https://www.erlang.org/news/148
System.version()
Call Mix.install([deps]) in scripts to avoid requiring a Mix project
# this is a hack to demonstrate how Mix.install works (as opposed to how to use it in Livebook, scripts etc)
# first, without Mix.install: Poison is not available, so the captured output is an error
code = "IO.puts(Poison.encode!(%{hello: :world}))"
{stdout, _res} = System.cmd("elixir", ["--eval", code], stderr_to_stdout: true)
IO.puts(stdout)
# this is a hack to demonstrate how Mix.install works (as opposed to how to use it in Livebook, scripts etc)
# in your scripts and Livebooks, just call the contents of code directly
code = """
Mix.install([:poison], force: true)
IO.puts(Poison.encode!(%{hello: :world}))
"""
{stdout, _res} = System.cmd("elixir", ["--eval", code], stderr_to_stdout: false)
IO.puts(stdout)
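For reference, outside this Livebook the same thing is just a plain script. A minimal sketch (a hypothetical hello.exs, matching the contents of code above):

# hello.exs (hypothetical standalone script) - run with: elixir hello.exs
Mix.install([:poison])

IO.puts(Poison.encode!(%{hello: :world}))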
parent = self()
# Elixir 1.12 adds System.trap_signal/3 and System.untrap_signal/2 for handling OS signals
System.trap_signal(:sigusr2, :id, fn ->
  Process.send(parent, {:msg, "Got SIGUSR2"}, [])
end)
:os.cmd(String.to_charlist("kill -s USR2 #{:os.getpid()}"))
receive do
  {:msg, msg} -> IO.puts(inspect(msg))
after
  2000 -> IO.puts("oops")
end
System.untrap_signal(:sigusr2, :id)
defmodule ThenTest1 do
defstruct [:field]
end
# before 1.12: piping into an anonymous function
%{}
|> Map.put(:field, "value")
|> (fn map -> Kernel.struct(ThenTest1, map) end).()
defmodule ThenTest2 do
defstruct [:field]
end
# Elixir 1.12: then/2 pipes the value into any one-argument function
%{}
|> Map.put(:field, "value")
|> then(&struct(ThenTest2, &1))
parent = self()
side_effect = fn x -> Process.send(parent, {:greeting, x}, []) end
"hello"
|> tap(side_effect)
|> String.reverse()
|> IO.puts()
receive do
  {:greeting, greeting} -> IO.puts("Got greeting: #{inspect(greeting)}")
after
  2000 -> IO.puts("Oops - no greeting")
end
# stepped ranges: first..last//step (new syntax in Elixir 1.12)
1..10//2
|> Enum.map(fn x -> x * 2 end)
|> IO.inspect()
0..10//2
|> Enum.map(fn x -> x * 2 end)
|> IO.inspect()
:ok
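Steps can also be negative, giving a first-class way to write decreasing ranges (a quick sketch):

# a decreasing range using a negative step
10..0//-2
|> Enum.to_list()
|> IO.inspect()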
# Enum.zip_with/3 (new in 1.12) zips two enumerables with a combining function
Enum.zip_with([1, 2], [3, 4], fn x, y -> x + y end)
# Stream.zip_with/3 is the lazy equivalent
concat = Stream.concat(1..3, 4..6)
Stream.zip_with(concat, concat, fn a, b -> a + b end) |> Enum.to_list()
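zip_with also has an arity-2 form that takes a list of enumerables and hands the function one element from each as a list (a small sketch):

# Enum.zip_with/2: zip any number of enumerables at once
Enum.zip_with([1..3, 4..6, 7..9], fn [a, b, c] -> a + b + c end)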
Check out the many, many wonderful changes in Elixir 1.12 at https://github.com/elixir-lang/elixir/releases
Axon is the Neural Networks library for Elixir Nx that we've been waiting for!
https://github.com/elixir-nx/axon
Mix.install([
  {:axon, "~> 0.1.0-dev", github: "elixir-nx/axon", branch: "main"},
  {:nx, "~> 0.1.0-dev", github: "elixir-nx/nx", sparse: "nx", override: true}
])
# a simple dense network: 784-dimensional input, one hidden layer of 128 units, 10-way softmax output
model =
  Axon.input({nil, 784})
  |> Axon.dense(128)
  |> Axon.dense(10, activation: :softmax)
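As a quick check (a sketch; the exact rendering depends on the Axon version), inspecting the model struct should show the layer graph we just built:

# print the model struct (assumption: this Axon version renders a layer summary via Inspect)
IO.inspect(model)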