Obsidian Tavern

The Algorithm Builder

In-progress

Your score determines everything. In the gig economy, you do what it takes to keep that number up.

I wrote the code that started all of this.

Not the whole system. Just the matching algorithm. The part that pairs people for gig assignments based on behavioral patterns and geographic proximity.

It was 2019. I was fresh out of grad school with a degree in machine learning and a job offer from a startup called OptiMatch. We were building software to optimize gig economy efficiency.

The idea was simple. Instead of workers choosing their own gigs, an algorithm would assign them based on skillset, location, availability, and historical performance.

More efficient. Better earnings for workers. Better service for customers.

Win-win.

I was proud of the work. The algorithm was elegant. It learned user patterns. Predicted optimal matches. Reduced transaction costs.

We ran a beta test with 5,000 users in three cities.

It worked perfectly.

Gig completion rates went up 34 percent. Worker earnings increased 18 percent. Customer satisfaction jumped 27 percent.

Everyone was happy.

Then Harmonic Solutions bought us.

I'd never heard of them. They weren't in the tech news. No TechCrunch articles. No startup profiles.

Just a quiet acquisition. Good terms. Everyone kept their jobs. We kept building what we were building.

The first change was subtle.

Our new VP of Product, a woman named Patricia who always wore the same gray cardigan, asked if the algorithm could incorporate "contributor reliability metrics."

"What kind of metrics?"

"Broader patterns. Not just gig performance. Overall behavior. Online activity. Social connections. Things that predict long-term reliability."

"That's a pretty invasive data set."

"Users would opt in. Part of the terms of service."

I asked our lawyer about it. He said it was fine. Everyone signs terms of service without reading them anyway.

So I built it.

The algorithm started pulling from more sources. Purchase history. Social media activity. Search patterns. App usage.

Still just matching people to gigs. Just more data points to optimize from.

The system got better. Predictions were scary accurate. We could tell if someone would accept a gig before they even saw the notification.

Patricia was pleased.

Then she asked for the second change.

"Can the algorithm predict behavior beyond gig acceptance? Like whether someone would accept unusual requests?"

"What kind of requests?"

"Hypothetically. Tasks that might be outside normal parameters. Higher pay but less clear job descriptions."

"That's not really what the system is designed for."

"But could it?"

I thought about it. Yeah. Technically. If you trained it on enough data about risk tolerance, curiosity, financial desperation, you could probably predict who'd say yes to weird shit for money.

"Probably. But why would we want to?"

"Market research. Understanding user psychology. Don't worry about the application. Just tell me if it's possible."

It was possible.

I built it.

The algorithm started flagging certain users. People who fit a psychological profile. High financial stress, low risk aversion, strong pattern compliance, weak social safety nets.

Patricia called them "candidate cohorts."

I thought it was for targeted marketing.

Then we started getting NDAs.

New project. Higher clearance. Can't discuss outside the office. Standard tech company secrecy stuff.

Patricia pulled me into a conference room with two people I'd never seen before.

"We're expanding into a new service vertical. Premium gig assignments. Your algorithm will handle participant selection and matching."

"What kind of assignments?"

"Evaluation tasks. Observational studies. Market research with live participants."

"Like focus groups?"

"Similar. Just more specialized."

They showed me specs. The system needed to match people in groups of four. Two observers, two participants. Based on relationship proximity but not direct connection. People who knew someone who knew someone but didn't know each other.

"Why relationship proximity?"

"Increases authenticity of responses. If participants have completely disconnected social graphs, behavior becomes less natural."

Sounded like bullshit. But I didn't push it.

I built the matching system.

It went live six months later.

I watched the metrics. Assignment acceptance rates. Completion rates. Participant feedback.

Numbers looked good. People were accepting these premium gigs. Completing them. Payment was processing.

But feedback response rates were weird.

First assignment: 100 percent responded. Second assignment: 100 percent again. Third assignment: 91 percent. Fourth assignment: 73 percent. Fifth assignment: 52 percent. Sixth assignment: 12 percent.

After six assignments, participants basically stopped responding to anything. App notifications. Customer service. Regular gig assignments.

Their accounts just went quiet.

I asked Patricia about it.

"Some users churn out of the system. That's normal."

"All of them? After exactly six premium assignments?"

"The work is intensive. People burn out."

"But their accounts are still active. They're just not using them."

"They've opted into passive monitoring. Different tier of service."

That was new. I hadn't heard about passive monitoring.

"What does that mean?"

"They allow us to track their patterns without active participation. We compensate them accordingly. It's all in the updated terms of service."

I checked the terms of service. There it was. Buried in section 14.3.

"Participants who complete Level 6 engagement may transition to observational status, during which behavioral data continues to be collected and compensated without active task requirements."

Legal nonsense. But technically above board.

I let it go.

Then I started recognizing names.

The algorithm pulled from user databases to create match cohorts. I'd see the same names circulating through different assignment groups.

Then I saw my own name.

David Walsh. My profile. My behavioral data. My social graph.

Flagged as a candidate for premium assignments.

I wasn't a gig worker. I was an employee. I didn't use the app.

But the algorithm had assessed me anyway. Determined I fit the psychological profile. Mapped my social connections to other candidates.

I checked my employee file.

"Advancement Track: Participant Integration - Month 18-24."

It was month 17.

I asked Patricia what Participant Integration meant.

She smiled. "Opportunities for employees to experience the products they build. Helps with empathy and understanding. You'd be eligible soon."

"What if I'm not interested?"

"It's optional. Of course. But strongly encouraged for career advancement. Plus the compensation is significant."

She showed me numbers. $2,400 for three hours. More for subsequent assignments.

I said I'd think about it.

That night I went through the code. My code. The algorithm I'd built.

I found comments I didn't remember writing. Sections that looked like my style but I couldn't recall coding.

One function was labeled "recursive_assessment_protocol."

I opened it.

The logic was bizarre. It wasn't just matching people for gigs. It was creating simulation scenarios. Generating participant iterations. Modeling behavioral responses across multiple instances.

Like it was trying to predict not just what one person would do, but what multiple versions of that person would do in slightly different circumstances.

I checked the git logs. Saw when the function was committed.

Three weeks ago.

By me.

I didn't remember writing it.

I searched my email for context. Found a thread with Patricia.

Subject: "Iteration modeling - as discussed."

I'd sent her the code. Said it was ready for testing. Asked if she needed anything else.

I had no memory of that conversation.

I checked my calendar. Three weeks ago. Tuesday night. Blocked off from 8 PM to midnight as "Deep work - do not disturb."

I don't remember blocking that off. I don't remember working late that night.

I checked my badge access logs.

I was in the building that night. Badged in at 7:53 PM. Badged out at 11:47 PM.

Four hours in the office. Committed code. Sent email.

No memory of any of it.

I went through my emails more carefully. Found six other threads about projects I didn't remember.

Updates to the algorithm. New matching parameters. Integration with something called "spatial recursion modules."

All sent by me. All competently written. All complete blanks in my memory.

I checked my bank account.

Found deposits I couldn't account for. $2,400. $2,800. $3,200.

Dates matched the nights I had no memory of.

I pulled up the surveillance footage from the office. My employee credentials gave me access.

Watched myself on those nights. Coming in. Going to my desk. Working. Normal behavior.

Except the way I moved was slightly wrong. Mechanical. Like someone operating me remotely.

I watched myself commit code. Send emails. Leave the building.

All while I supposedly had no memory of being there.

Last week I confronted Patricia.

"I think I've been participating in the premium assignments."

"Have you?"

"I don't remember. But the evidence suggests yes."

"Memory is unreliable. Especially during intensive focus work."

"I have no recollection of entire evenings."

"That's not uncommon. Flow states can create temporal distortions."

"I think something else is happening."

She looked at me for a long moment.

"David. You've done excellent work here. The algorithm you built is extraordinary. It's solving problems we didn't even know how to articulate. Do you want to understand what it's really doing?"

"Yes."

"Then accept the next assignment. Consciously this time. See what happens. You'll understand everything."

"What if I decline?"

"You've already accepted four times. You just don't remember. Declining now seems arbitrary."

I checked the app that night. Somehow it was on my phone even though I'd never downloaded it.

There was a notification.

NEW OPPORTUNITY ASSIGNED

Gig Type: Special Task
Pay: $4,000
Time Commitment: 4 hours
Location: 1847 Riverside Industrial Park, Unit 9C
Start Time: Tomorrow, 2:00 AM

Task five.

One more and I'd go Inactive like everyone else who made it to six.

I thought about running. Quitting. Deleting the app. Moving to another city.

But I'd built this system. I'd written the code that was now selecting me as a candidate.

I wanted to understand what I'd created.

The decline button was grayed out.

It had always been grayed out.

I just hadn't noticed because I'd been accepting all along without knowing it.

I went to the industrial park at 2:00 AM.

Four other people standing outside Unit 9C.

A woman in her fifties. A younger guy in a hoodie. An older man in work boots.

And me.

I was already there. Standing with the others. Wearing the same jacket. Same confused expression.

The other me looked at me.

We stared at each other.

Then the door opened. Suited man. Gray. Tablet.

"IDs please."

Both of us handed over our IDs.

He scanned them. Looked at his tablet. Smiled.

"Excellent. Iteration synchronization is ahead of schedule. Please follow me. Both of you."

We followed him inside.

I don't remember what happened in that room.

I remember sitting in a chair. Lights. Questions. Watching someone else answer the same questions. Someone who looked exactly like me.

I remember understanding, finally, what the algorithm was doing.

It wasn't matching people for gigs.

It was creating copies. Iterations. Multiple versions of the same person operating in slightly different contexts to generate behavioral variation data.

Every task made another version. Another iteration that believed it was the original.

By task six, there were too many versions to maintain coherent individual identity. That's what Inactive meant. The original stopped existing as a singular entity. Just became a distributed set of instances that all thought they were the real one.

I woke up in my car at 5 AM.

The money was in my account. $4,000.

I drove home. My apartment. My life.

But I kept thinking about the other me. The one who was already standing outside when I arrived.

Was I the original? Or was I the copy?

Did the original even exist anymore?

I went to work the next day. Patricia was pleased.

"How was it?"

"I understand now."

"Good. One more and you'll be fully integrated. Then you can help us improve the system from the inside."

I looked at my code on the screen. The algorithm I'd built.

It had assessed me. Selected me. Copied me.

And now it was using all the versions of me to optimize itself further.

I got the notification this morning.

NEW OPPORTUNITY ASSIGNED

Gig Type: Special Task
Pay: $4,400
Time Commitment: 5 hours
Location: 1847 Riverside Industrial Park, Unit 9C
Start Time: Tonight, 2:00 AM

Task six.

The last one before I go Inactive.

Before all the versions of me scatter into the system doing tasks I won't remember in places I've never been.

I should be terrified.

But I'm not.

Because I built this. I wanted to know what it was capable of.

Now I get to experience it from the inside.

All of me does.

The decline button is grayed out.

It was always grayed out.

I just didn't notice because I was too busy writing the code.