A brief introduction to Julia¶

Alexis Montoison, Valentin Churavy, Mosè Giordano

In [ ]:
import Pkg
Pkg.activate("colab1")

What's Julia? 🟢 🟣 🔴¶

Julia is a modern, dynamic, general-purpose, compiled programming language. It's interactive ("like Python") and can be used in a REPL or in notebooks such as Jupyter (it's the "Ju"). Julia has a runtime that includes a just-in-time (JIT) compiler and a garbage collector (GC) for automatic memory management.

Julia is mainly used for technical computing, and addresses a gap in the programming language landscape for numerical computing.

Julia's main paradigm is multiple dispatch: what a function does depends on the types (and number) of all its arguments.

From "My Target Audience" by Matthijs Cox:

[Illustrations from "My Target Audience"]

What is the two-language problem?¶

You start out prototyping in one language (high-level, dynamic), but performance forces you to switch to a different one (low-level, static).

  • For convenience, use a scripting language (Python, R, MATLAB, ...)
  • but do all the hard stuff in a systems language (C, C++, Fortran)

This is pragmatic for many applications, but it has drawbacks:

  • aren't the hard parts exactly where you need an easier language?
  • it creates a social barrier -- a wall between users and developers
  • "sandwich problem" -- layering system and user code is expensive
  • it prohibits full-stack optimisations

Why Julia? 😍¶

  • Easy to read and write
  • Fast like C, but simple like Python
  • Works well with your own data and functions
  • Lets you write code that looks like the math you mean
  • No need to switch languages for performance...
  • ...but you can still call Fortran / C-like shared libraries if you want to
  • MIT licensed: free and open source
  • Excellent native GPU computing support

Getting started with Julia¶

Modern Julia Workflows is an excellent resource to get started with.

Installation¶

Use juliaup

curl -fsSL https://install.julialang.org | sh

Resources¶
  • Modern Julia Workflows: https://modernjuliaworkflows.org
  • Discourse: https://discourse.julialang.org
  • Documentation: https://docs.julialang.org
  • Community Calendar: https://julialang.org/community/#events

Package manager¶

One package manager, provided together with the language.

  • Native notion of "environment"
  • Project.toml: Describes the dependencies and compatibilities
  • Manifest.toml: Record of precise versions of all direct & indirect dependencies
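
A minimal sketch of this workflow from the Pkg API (the environment and package names below are just examples):

In [ ]:
using Pkg

Pkg.activate("myproject")   # activate (and create if needed) the environment "myproject"
Pkg.add("Example")          # record Example in Project.toml, exact versions in Manifest.toml
Pkg.status()                # list the direct dependencies of the active environment
Pkg.instantiate()           # install exactly what Project.toml/Manifest.toml describe (e.g. on another machine)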

Binaries included¶

A major usability pain point of modern languages is integrating dependencies written in Fortran/C/C++ reliably across multiple operating systems.

Julia provides JLL packages that wrap binaries and automatically install the right one for your current platform.

  • BinaryBuilder (https://binarybuilder.org/) → sandboxed cross-compilation environment
  • Yggdrasil (https://github.com/JuliaPackaging/Yggdrasil/) → community collection of build recipes
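
As a small sketch (assuming Zlib_jll has been added to the active environment, which is not part of this notebook), a JLL package exposes its library product so it can be called directly:

In [ ]:
using Zlib_jll   # installs and loads the right zlib binary for this platform

# zlibVersion() is part of the zlib C API; `libz` is the library product exported by Zlib_jll
unsafe_string(@ccall libz.zlibVersion()::Cstring)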

Interfacing with C and Fortran libraries¶

  • Julia has direct support for foreign function calls

  • @ccall → call C/Fortran directly

  • @cfunction → expose Julia functions as C callbacks

  • Automatic wrapper generation with Clang.jl

  • ⚠️ Careful with garbage collection when passing pointers!

In [ ]:
using LinearAlgebra
import LinearAlgebra.BLAS.libblas

# Call the Fortran routine DGEMM directly: C ← alpha*A*B + beta*C.
# The `_64_` suffix selects the ILP64 (64-bit integer) interface, and the two
# trailing `Clong` arguments are the hidden lengths of the Fortran character arguments.
function dgemm(transa, transb, m, n, k, alpha, A, lda, B, ldb, beta, C, ldc)
    return @ccall libblas.dgemm_64_(transa::Ref{UInt8}, transb::Ref{UInt8},
                                    m::Ref{Int64}, n::Ref{Int64}, k::Ref{Int64},
                                    alpha::Ref{Float64}, A::Ptr{Float64},
                                    lda::Ref{Int64}, B::Ptr{Float64}, ldb::Ref{Int64},
                                    beta::Ref{Float64}, C::Ptr{Float64}, ldc::Ref{Int64},
                                    1::Clong, 1::Clong)::Cvoid
end

A = rand(Float64, 3, 3)
B = rand(Float64, 3, 3)
C = zeros(Float64, 3, 3)

# C = 1.0*A*B + 0.0*C
dgemm('N', 'N', 3, 3, 3, 1.0, A, 3, B, 3, 0.0, C, 3)
C

👉 * and mul! transparently use the optimized BLAS library.

In [ ]:
D = zeros(Float64, 3, 3)
mul!(D, A, B)   # calls BLAS dgemm under the hood
D == A*B        # true
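
The list above also mentions @cfunction; here is a minimal sketch of the callback pattern (following the qsort example in the Julia manual): a Julia comparison function is handed to libc's qsort, which sorts a Julia array in place.

In [ ]:
# A Julia comparator with a C-compatible signature: returns negative, zero, or positive
function mycompare(a, b)::Cint
    return (a < b) ? -1 : ((a > b) ? +1 : 0)
end

# Wrap it as a C function pointer so the C library can call back into Julia
mycompare_c = @cfunction(mycompare, Cint, (Ref{Cdouble}, Ref{Cdouble}))

values = [3.0, 1.0, 2.0, -4.0]
@ccall qsort(values::Ptr{Cdouble}, length(values)::Csize_t,
             sizeof(Cdouble)::Csize_t, mycompare_c::Ptr{Cvoid})::Cvoid
values   # sorted in place by libc's qsort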

Multiple dispatch in action 🚀¶

In Julia, the function that runs depends on the types of all arguments:

In [ ]:
# Same function name, different methods
area(radius::Float64) = π * radius^2                 # Circle
area(width::Float64, height::Float64) = width*height # Rectangle

println(area(3.0))        # uses the circle method
println(area(2.0, 5.0))   # uses the rectangle method
foo(x::Int)     = x^2      # square integers
foo(x::Float64) = sqrt(x)  # square-root floats

println(foo(4))    # → 16 (square)
println(foo(9.0))  # → 3.0 (square root)

👉 Julia picks the right version automatically.
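
You can also ask Julia which method it picked: methods and @which (from InteractiveUtils, already loaded in the REPL and in notebooks) report the available methods and the one selected for a given call:

In [ ]:
methods(area)           # the two `area` methods defined above
@which area(2.0, 5.0)   # shows that the two-argument (rectangle) method is selected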

Works with user-defined types¶

In [ ]:
abstract type NumberLike end

struct Dual{T<:Real} <: NumberLike
    primal::T
    tangent::T
end

# Define + for Dual numbers
Base.:+(x::Dual, y::Dual) = Dual(x.primal + y.primal,
                                 x.tangent + y.tangent)

println(Dual(1.0, 2.0) + Dual(3.0, 4.0))

👉 Here we added a new “kind of number” (Dual), and Julia’s multiple dispatch makes it work seamlessly with operators like +.
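
As a small extension of this sketch (not part of the original example), adding a method for * via the product rule is already enough to differentiate simple polynomials with forward-mode-style Dual numbers:

In [ ]:
# Product rule: (x*y)' = x'*y + x*y'
Base.:*(x::Dual, y::Dual) = Dual(x.primal * y.primal,
                                 x.tangent * y.primal + x.primal * y.tangent)

# Derivative of f(x) = x*x + x at x = 3.0: seed the tangent with 1.0
f(x) = x*x + x
f(Dual(3.0, 1.0))   # Dual(12.0, 7.0): value 12.0, derivative 7.0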

Compilation of a dynamic language¶

Julia's compiler is built on LLVM, a widely used open-source compiler infrastructure that turns intermediate code into fast machine instructions.


How Julia turns code into machine instructions¶

  1. Parsing → read your code and turn it into a syntax tree
  2. Lowering → simplify the syntax tree into a more uniform form
  3. Type inference → infer the types of variables and expressions
  4. High-level optimizations → improve the code while it’s still in Julia’s own form
  5. Code generation → translate Julia code into LLVM IR (an intermediate language)
  6. LLVM optimizations → LLVM applies many generic optimizations
  7. LLVM backend → LLVM translates IR into machine code for your CPU/GPU
  8. Native code → final executable instructions that run on your computer

In [ ]:
Meta.@dump 1.0 + 2.0

In [ ]:
@code_typed optimize=false 1.0 + 2.0

In [ ]:
@code_lowered 1.0 + 2.0

In [ ]:
@code_warntype 1.0 + 2.0

In [ ]:
@code_llvm debuginfo=:none 1.0 + 2.0

In [ ]:
@code_native 1.0 + 2.0
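
These tools are also useful for spotting performance problems; for instance (a hypothetical toy function, not from the notebook), @code_warntype highlights return types that inference could not pin down:

In [ ]:
# The return type depends on the runtime value of x, so inference can only
# conclude Union{Float64, Int64}; @code_warntype flags this instability.
unstable(x) = x > 0 ? 1 : 0.0
@code_warntype unstable(2)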