Fly.io is now free for small Phoenix projects. This post is about building a wicked LiveView app with realtime collaboration features. If you already have a Phoenix app to deploy, try us out. You can be up and running in just a few minutes.
We decided that 2022 was a good year to ship a full-stack Phoenix reference app.
The “full stack” metaphor has progressed beyond its humble beginnings of some REST endpoints and sprinkles of JS and CSS. Showing off a todo app is also no longer state of the art. A reference app should really stress a framework and match the needs of apps being built today. Remember turntable.fm? That’s a more interesting challenge.
A good full-stack framework should help you solve ALL the problems you need to build something like turntable.fm quickly, and then iteratively make it more powerful. Live updates are no longer optional, and a solo full-stack developer should be able to deliver these features with the same productivity as a CRUD Rails app in 2010.
Meet LiveBeats, a social music application we wrote to show off the LiveView UX, while serving as a learning example and a test-bed for new LiveView features. As such, it is, of course, open source — follow the development here!
🔥🎶🔥 Try LiveBeats Now! 🔥🎶🔥
If you’re not familiar with LiveView, our overview states:
LiveView strips away layers of abstraction, because it solves both the client and server in a single abstraction. HTTP almost entirely falls away. No more REST. No more JSON. No GraphQL APIs, controllers, serializers, or resolvers. You just write HTML templates, and a stateful process synchronizes it with the browser, updating it only when needed. And there’s no JavaScript to write.
Here are some of the things LiveBeats demonstrates:
A “live”, shared UI. What one person does is visible to everyone else who’s connected.
File management. Uploads should be quick, with a live UI. And the framework should make it easy to use different storage backends. Third-party object storage is one way to do this, but sometimes development is simpler with just a filesystem. LiveView uploads are great for both!
Presence! Apps are more interesting when your friends show up.
We can accomplish all this with just Phoenix and LiveView, in a shockingly small amount of code. It’s also super fast.
To really see what’s special about LiveView and its new features, you need to experience it for yourself. Sign in with GitHub OAuth, upload a playlist of MP3s, and listen to music with your friends. Playback syncs in real-time, presence shows who is currently listening, and file uploads are processed concurrently as they are uploaded.
Here’s a two-minute demo to see it in action:
Playlist Sync and Presence with Phoenix PubSub
The hallmark of any “live” or social app is seeing your friends’ activity as it happens. In our case, we want to show who is currently listening to a given playlist, and sync the playback of songs as the owner drives the song selection. This is what it looks like:
Phoenix PubSub makes this trivial. In a few lines of code in our business logic, we broadcast updates, then with a few LOC in the LiveViews we listen for the events we care about. When they come in, we update the UI. Everyone connected sees the pages update, even if they are on different horizontally scaled servers.
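The broadcast/subscribe pairing looks roughly like this. This is a minimal sketch with illustrative module, topic, and event names (`MyApp.PubSub`, `"profile:<id>"`, `persist_profile/2`), not LiveBeats’ exact code:

```elixir
# In the context module: broadcast after a successful write.
def update_profile(profile, attrs) do
  with {:ok, profile} <- persist_profile(profile, attrs) do
    # Every node in the cluster receives this message.
    Phoenix.PubSub.broadcast(
      MyApp.PubSub,
      "profile:#{profile.id}",
      {MyApp.MediaLibrary, %{profile: profile}}
    )

    {:ok, profile}
  end
end

# In the LiveView: subscribe once the socket is connected,
# then react to broadcasts in handle_info/2.
def mount(_params, _session, socket) do
  if connected?(socket) do
    Phoenix.PubSub.subscribe(MyApp.PubSub, "profile:#{socket.assigns.profile.id}")
  end

  {:ok, socket}
end
```

Because Phoenix PubSub is cluster-aware, the broadcast reaches subscribers on every node, which is why horizontally scaled servers stay in sync with no extra plumbing.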
With friction-free PubSub at your fingertips, any feature that makes sense to be real-time gets to be real-time—even changing URLs on the fly.
For example, user profiles are served at their username, such as livebeats.fly.dev/chrismccord. But we allow users to update their username in the app, which changes that URL. We don’t want other users listening to their profile to be stuck with an invalid URL that fails on refresh, sharing, or with a click on the profile link.
Handling the URL change took a whopping six lines of code in our Profile LiveView!
def handle_info({MediaLibrary, %PublicProfileUpdated{} = update}, socket) do
  {:noreply,
   socket
   |> assign(profile: update.profile)
   |> push_patch(to: profile_path(update.profile))}
end
We are already subscribed to profile notifications, so we just have to handle the PublicProfileUpdated event, update the template profile state, and then push_patch to the client to trigger a pushState browser URL change.
For a traditional application, this kind of feature would take standing up WebSocket connections, ad-hoc HTTP protocols, polling the server for changes, and other complexities.
Let’s see it in action:
Concurrent Upload Processing
LiveView uploads are handled over the existing WebSocket connection, providing interactive file uploads with progress reporting out of the box, with no user-land JavaScript.
It also allows other neat features like processing the file on the server before writing it to its final location. You might be thinking handling files on your server is sooo mid 2000’s, but hear me out! Servers can have volumes… and you can like write files to them!
Think about it. There aren’t any Lambdas to wire up (and pay for per invocation), no webhooks to stand up, and no external message queues to configure, because it all happens over the existing LiveView connection. Compared to a typical cloud upload solution, we can probably cut out a few paid products and just as many failure modes.
LiveView also guarantees the temporary uploaded file is on the same load-balanced instance of the LiveView processing the page. There’s no external state to jump around between servers. So you write regular code that takes the file, post-processes or verifies the bits on disk, then writes it to its final location – all the while reporting to the UI the progress of each step.
And we do need to extract data out of those binary blobs one way or another, because before we write that MP3 to storage, we want to know that (a) it’s not a malicious file masquerading as an MP3 and (b) it’s less than 20 minutes long.
So the user drags and drops a handful of MP3s into the app, we upload them concurrently over the WebSocket connection, then we concurrently parse the binary MP3 data in a temporary file to verify it’s a valid MP3 of acceptable duration. Once complete, we show the calculated duration on the UI and the user can save their playlist.
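In LiveView terms, that flow is an allow_upload plus consume_uploaded_entries pairing. A minimal sketch, with illustrative limits and a hypothetical persist_song!/1 standing in for the validation and copy step:

```elixir
def mount(_params, _session, socket) do
  {:ok,
   allow_upload(socket, :songs,
     accept: ~w(.mp3),
     max_entries: 10,
     auto_upload: true
   )}
end

def handle_event("save", _params, socket) do
  songs =
    consume_uploaded_entries(socket, :songs, fn %{path: tmp_path}, _entry ->
      # The temp file lives on this node, so we can inspect the bytes
      # directly: verify it's a real MP3, compute its duration, then copy
      # it to its final destination. persist_song!/1 stands in for that work.
      {:ok, persist_song!(tmp_path)}
    end)

  {:noreply, update(socket, :songs, &(&1 ++ songs))}
end
```

Progress events flow back over the same socket, so the UI can show per-file progress while each entry is still in flight.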
An aside: It turns out calculating MP3 duration from file content is actually a pain.
MP3s can contain ID3 tag metadata about the file, but answering the simple question of “how long is this song?” is surprisingly difficult. To properly calculate the duration of an MP3, you must walk all the frames and take bitrate encodings into consideration. There’s a great write-up here on what’s involved with step-by-step Elixir code to make it happen.
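To give a flavor of the math the frame walk relies on: for MPEG-1 Layer III, each frame carries 1152 samples, so its duration and byte length follow directly from the bitrate and sample rate read out of each frame header. A sketch of just those formulas (not LiveBeats’ actual parser):

```elixir
defmodule Mp3Math do
  # MPEG-1 Layer III frames always carry 1152 samples.
  @samples_per_frame 1152

  # Milliseconds of audio contributed by one frame at the given sample rate.
  def frame_duration_ms(sample_rate) do
    @samples_per_frame / sample_rate * 1000
  end

  # Byte length of one frame, used to seek to the next frame header.
  # padding is 0 or 1, read from the header's padding bit.
  def frame_length(bitrate_bps, sample_rate, padding) do
    trunc(144 * bitrate_bps / sample_rate) + padding
  end
end
```

Summing frame_duration_ms across every frame gives the total duration; for variable-bitrate files, where each frame may declare a different bitrate, this per-frame walk is the only reliable answer.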
Components
LiveView recently shipped a new HEEx template engine that supports React-style template syntax with function components. Function components are reusable functions that encapsulate a bit of markup to be used throughout your UI. Think dropdowns, modals, tables, etc. We have a post all about function components, if you want to dive deeper.
For example, one of our components is a TailwindUI dropdown, which looks like this:
And this is what it looks like in code to use anywhere you’d like a dropdown:
~H"""
<.dropdown id={@id}>
  <:img src={@current_user.avatar_url}/>
  <:title><%= @current_user.name %></:title>
  <:subtitle>@<%= @current_user.username %></:subtitle>
  <:link navigate={profile_path(@current_user)}>View Profile</:link>
  <:link navigate={Routes.settings_path(Endpoint, :edit)}>Settings</:link>
  <:link href={Routes.session_path(Endpoint, :sign_out)} method={:delete}>
    Sign out
  </:link>
</.dropdown>
"""
Along with function components, LiveView includes slots, which allow a component to specify named areas where the caller can place arbitrary content. The actual dropdown component is just a simple function that encapsulates all the markup and classes of our Tailwind dropdown:
def dropdown(assigns) do
  assigns =
    assigns
    |> assign_new(:img, fn -> nil end)
    |> assign_new(:title, fn -> nil end)
    |> assign_new(:subtitle, fn -> nil end)

  ~H"""
  <div class="px-3 mt-6 relative inline-block text-left">
    ...
  </div>
  """
end
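To show how the elided markup consumes those slots: in today’s Phoenix.Component API, slots are declared with the slot macro and rendered with render_slot, and each slot entry carries its attributes as a map. A stripped-down, illustrative skeleton (not the real dropdown internals):

```elixir
# The real dropdown has far more markup, SVGs, and ARIA wiring.
slot :img
slot :title
slot :subtitle
slot :link

def dropdown(assigns) do
  ~H"""
  <div class="px-3 mt-6 relative inline-block text-left">
    <div><%= render_slot(@title) %></div>
    <div class="text-sm text-gray-500"><%= render_slot(@subtitle) %></div>
    <div role="menu">
      <%= for link <- @link do %>
        <.link {assigns_to_attributes(link)} role="menuitem">
          <%= render_slot(link) %>
        </.link>
      <% end %>
    </div>
  </div>
  """
end
```

assigns_to_attributes forwards the caller’s slot attributes (navigate, href, method) onto the link while stripping the reserved slot keys.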
There’s a lot going on inside the function with Tailwind classes, SVGs, list item building, accessibility attributes, etc., but the internal details aren’t important. The idea is: we define our dropdown in a single place in our application, complete with how it’s styled and made accessible, and then we use <.dropdown> throughout our UI.
Going Global
Elixir is a distributed programming language and we exploit this fully. LiveBeats is deployed on five continents and the servers cluster together automatically over a private network to broker updates. This is what the future of full-stack looks like.
You should serve your full-stack app close to users for the same reason we all agree CDNs are necessary for serving assets close to users. Less latency and more responsiveness improve the experience, and conversions increase across the board. Folks know this intuitively, but historically we’ve only applied it to assets. With Elixir, our entire stack can take advantage of regional access.
For LiveBeats, we went global with only minor changes. First, we set up file proxying between servers, where we stream data from one region to another. Next, we set up Postgres read replicas in each region and we perform standard replica reads against mostly static data.
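The replica-reads piece follows the standard Ecto pattern: a second, read-only repo pointed at the nearest replica. The module name, env var names, and query below are illustrative assumptions, not LiveBeats’ exact setup:

```elixir
# A read-only repo for regional replica reads.
defmodule LiveBeats.ReplicaRepo do
  use Ecto.Repo,
    otp_app: :live_beats,
    adapter: Ecto.Adapters.Postgres,
    read_only: true
end

# config/runtime.exs -- assumes the platform exposes a per-region
# replica URL, falling back to the primary when none exists.
config :live_beats, LiveBeats.ReplicaRepo,
  url: System.get_env("REPLICA_DATABASE_URL") || System.get_env("DATABASE_URL")

# Mostly static data can then be read from the nearby replica,
# while writes keep going through the primary Repo.
songs = LiveBeats.ReplicaRepo.all(Song)
```

Writes still travel to the primary region; only latency-sensitive reads of mostly static data take the regional shortcut.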
We even set up a ping tracker for each user. You can view your own pings, along with the locations and pings of any other visitors, to visualize where your friends are connected across the globe. It’s also neat to see what kind of speedy UX they have on the app.
Check it out:
Here we have a user connecting from the US (iad) and one from Australia (syd), both with fast pings to the regional LiveBeats instances.
Here’s a bonus video giving a window into the process of scaling LiveBeats across regions on Fly.io.
Client-side interactions
The LiveView paradigm necessarily requires a server to be connected, but this doesn’t mean all interactions should go to the server. Operations that can immediately happen on the client should stay on the client.
LiveView has a JS command interface that allows you to declare client-side effects that work seamlessly with server-issued UI updates. Things like opening modal dialogs and toggling menu visibility happen instantly on the client without the server needing to be aware, just as it should be.
Let’s see it in action using the LiveBeats modal:
<.modal
  id={@id}
  on_confirm={
    JS.push("delete", value: %{id: @song.id})
    |> hide_modal(@id)
    |> hide("#song-#{@song.id}")
  }
>
  Are you sure you want to delete "<%= @song.title %>"?
  <:cancel>Cancel</:cancel>
  <:confirm>Delete</:confirm>
</.modal>
When a user attempts to delete a song, we show a modal component asking for confirmation. The key aspect is this: when “Delete” is clicked, our on_confirm attribute asynchronously pushes a delete event to the server, but immediately hides the modal and the song in the playlist.
So we get instant user interaction without any latency, while the song is deleted on the server, just like traditional optimistic UI patterns in a JavaScript framework.
Since the JS command interface is regular Elixir code, we can compose functions to handle client-UI operations, such as hide above:
def hide(js \\ %JS{}, selector) do
  JS.hide(js,
    to: selector,
    time: 300,
    transition:
      {"transition ease-in duration-300", "transform opacity-100 scale-100",
       "transform opacity-0 scale-95"}
  )
end
This wraps JS.hide and uses :transition to give it a slick Tailwind-based CSS transition when hiding elements.
What’s Next
With LiveBeats fleshed out, we’ll continue to add features and use it as a test-bed for in-progress LiveView development. Expect future content deep diving into approaches and considerations when bringing your LiveView application close to users around the world!