Life in numbers

Or how to measure a year when cups of coffee aren’t enough

By Alana

Ah, mid-February. The garbagest time of year. 

New year’s resolutions cracking at the edges. Healthy January habits crumbling beneath oppressive skies, snow, and the interminability of work. The air hurts your face and you may, like me this week, be jailed indoors by mother nature.

My front door this week

To avoid the shame-spiral of once again failing at new year’s resolutions, I didn’t set any this year. Take that, brain!

Let’s do it differently this year, January Alana thought. Instead of setting overly ambitious goals untethered to reality in any conceivable way, what if I spent the glum winter months looking back? Celebrating 2024’s wins and wisdom earned? Would I learn something to help me be more mindful in my intentions for 2025?

2024 in tiny stick people

In December, I purchased a watercolour accordion sketchbook with twelve connected panels. Seemed meant to be for my reflection project - a panel for every month. 

I sorted through photos of 2024, jotting down moments I wanted to remember. Then, I inexpertly sketched these memories into my journal.

Zoom art dates with my bestie and my mum kept me accountable. I finished my 2024 journal last week and peppered the blank spaces between paint with written reflections.

A video of Alana’s 2024 art journal

2024 in numbers the manual way

That left the back panel of the book - a logical spot for a summary. After all, I liked my stats (books read, classes taught, kilometers run), and I had ample (too many!) data sources.

The exercise of reducing twelve months of my life into an infographic raised questions.

What tools did I trust the most - Google Timelines, Strava, Garmin, or others - and why were there so many discrepancies? From reading, to shopping, to every city I’d visited, how I got there, and who I was with, a shocking number of my choices were quantifiable and tracked. 

A page from Alana's art journal summarizing 2024 by stats

How the heck did I spend 25% of 2024 asleep?

The resulting graphic in my journal was cute. But it also felt reductive, missing those quiet, precious moments unconveyed by datapoints in apps.

I was deep in this existential waffling when I first heard about Google Whisk.

What’s Whisk got to do with it?

Google Whisk launched in Canada this week. The tech monolith's newest consumer-facing AI tool lets anyone upload pictures and type word prompts, which it combines to generate entirely new images.

Whisk’s FAQ explains how it works:

  • I upload images or text into Whisk’s pretty interface

  • Gemini, Google’s AI help-bot, ‘reads’ the images and writes text captions for them

  • Google’s Imagen 3 generative art tool reads the caption Gemini wrote and creates its own detailed prompt

  • Imagen 3 uses this prompt to generate an image 

Image turns to text and then more text and then back into an image. 
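The four steps above amount to a little relay race: image to caption, caption to expanded prompt, prompt to image. A minimal sketch of that pipeline, with placeholder stand-in functions (these are illustrations of the flow, not Google's actual Whisk or Imagen API):

```python
# Sketch of Whisk's image -> text -> text -> image pipeline.
# All function names and outputs are hypothetical stand-ins for illustration.

def caption_image(image_description: str) -> str:
    """Stand-in for Gemini 'reading' an uploaded image and captioning it."""
    return f"a photo of {image_description}"

def expand_prompt(caption: str, user_text: str) -> str:
    """Stand-in for the model turning the caption plus the user's
    words into a more detailed generation prompt."""
    return f"{caption}, reimagined as {user_text}, richly detailed"

def generate_image(prompt: str) -> str:
    """Stand-in for Imagen 3 rendering an image from the final prompt."""
    return f"<image generated from: {prompt}>"

def whisk_pipeline(image_description: str, user_text: str) -> str:
    caption = caption_image(image_description)       # step 2: image -> text
    prompt = expand_prompt(caption, user_text)       # step 3: text -> more text
    return generate_image(prompt)                    # step 4: text -> image

print(whisk_pipeline("a runner at sunrise", "a watercolour sketch"))
```

The interesting design choice is that the original pixels never reach the image generator - only the text descriptions do, which is why Whisk's outputs riff on your photos rather than edit them.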

But wait! Weren’t these essentially the same steps I took to make my watercolour journal? 

I began with photos and wrote text prompts (‘mum’s second stroke’, ‘Cindy’s wedding,’ ‘my LOTR Steve tattoo’ etc) to summarize each month of 2024. I tabulated data about my life from multiple sources. I then created paintings from that inspiration.

Photos into text and numbers into paintings.

2024 as translated by generative AI 

As a sci-fi reader, I have a technocynic’s fascination with generative AI and all its ethical quandaries. Is big tech compensating the artists whose work is used to train gen AI models? How do creator biases manifest and what are we doing to counter the massive environmental costs of running diffusion models at scale?

Can algorithms convincingly capture the idiosyncrasies of human speech and art?

I decided to investigate, using all this data and art I’d amassed. 

First, I spat my movement data into three gen AI tools. (Using only one AI model is like reading only one newspaper. These are tools with slant - seek multiple perspectives.)

What can you tell me about the person this data represents, I asked. 

Some gems:

  • “Your cycling habits show a clear connection to warmer weather, and your running volume fluctuates throughout the year, suggesting you are somewhat influenced by seasonal changes. You utilize multiple apps to track your runs, indicating an interest in data and potentially using it to monitor your progress and understand your activity levels.” - Gemini

  • “Your running gear is practical but well-loved—a pair of reliable running shoes, moisture-wicking clothes, and maybe a lightweight jacket for cooler days.” (hahaha) “Your fitness tracker (likely a Fitbit or Garmin) is snug on your wrist, syncing seamlessly with Strava, where you proudly log your runs and occasionally share them with friends or a community of fellow runners.” - Deepseek

  • “While your increased activity in the latter part of the year is commendable, maintaining a consistent running schedule throughout the year can lead to better overall fitness and performance.” - ChatGPT

I fed these AI-generated statements about myself into generative art tools, asking them to illustrate the person described by the text.

Multiple tools struggled with visually depicting my alleged obsession with data. I’m also pretty confident that reading the text in some of these images backwards is actually an invocation for an eldritch ritual.

Whisk was the most fun - I fed it images from my journal, photos from my year, and text prompts. 

To generate the above in Whisk, I fed it the ‘year in numbers’ from my art journal, a photo of me running, and a previous image Whisk generated based on my 2024 movement data.

How do you measure a year?

My dataset was tiny, my methods unscientific. Did I discover something profound? Meh.

My biggest insight was simple and deeply low-tech: take the time to notice.

Creating my journal made me tear up more than once. I had to, for example, decide how to honour two friends who died last year. Both were around my age - what could I possibly paint to convey the light the world lost in each of them?

Querying AI models about my photos and stats provoked different ideas. The models had a tendency to future-plan in ways I didn’t feel ready for.

“It would be interesting to see how these patterns continue into 2025 and beyond!” Gemini said. “Is there anything in this ‘data picture’ that you feel is particularly accurate or surprising?”

I don’t know, Gemini. 

But I do know that when the world is heavy, I’ve found solace in a winter slow down. That I’ll try to carry some of the spirit of this season into the rest of my year.

Take the picture, paint, breathe, and you’ll notice more. Share what you saw looking back over your shoulder with others - human or AI. 

Maybe they can help you dream about what comes next.
