You are the president of Neuralink. Neuralink is a neurotechnology company that is developing implantable brain-machine interfaces (iBMIs).
BMIs allow for human brains to directly interact with machines. Neuralink's goal is to implant computers into human heads, connected to their brain with an iBMI. This will allow for expansion of human cognitive abilities, allowing people to make more accurate calculations, obtain immediate access to knowledge, and remember things more clearly.
Think of all the ways one might use a smartphone. Instead of pulling out the calculator app, Neuralink seeks to let you do the same calculations more swiftly, and just as accurately, in your own head. Instead of taking pictures, you could directly access memories of an event with perfect clarity. Even Google could be accessed with a single thought, much as Google Assistant can search things for you with the simple phrase "Okay Google."
You will have to make decisions that will determine the shape of Neuralink's iBMI project.
[[The First Decision]]Neuralink has a great capacity for change. Your researchers have found a way to affect not just the cognitive parts of the brain, but the more conscious parts as well.
The decision that lies before you is whether or not Neuralink should pursue mood regulation as part of the final device. This could have great benefits for people with mood disorders, allowing the iBMI chip to stabilize emotions rather than relying on the current medication-centered approach. About 7.1% of the US population suffers from a major depressive episode each year, and Neuralink could help prevent that.
Of course, that would require more funding and ultimately raise the price of the final product. It would also make people less willing to purchase the chip, as they may not trust a computer to mess with their emotions.
[[Approve Emotion-Regulation]]
[[Deny Emotion-Regulation]]
You have approved emotion regulation. However, its cost has been noted by your shareholders.
They've realized, however, that this could be used to affect things that aren't strictly... beneficial to the user. You could give someone a craving for, say, Coca-Cola. In return, Coca-Cola would pay a premium, just like with a normal advertisement.
This could easily cover the increased costs of the research, and make both the shareholders and yourself quite a bit of money.
[[Accept Mood-Advertisement]]
[[Deny Mood-Advertisement]]Mood regulation seems a little too invasive, doesn't it?
Your shareholders have noted a major source of income, however:
The advertisement industry is massive.
They propose to tap that industry: display advertisements in people's brains.
The benefit of this is that you could greatly drop the sale price of the iBMIs, making them far more accessible.
The downside, of course, is that no one likes ads. But... an extra fee, possibly even a subscription, could remove advertisements while still allowing your technology to get into the heads of people who couldn't afford it before.
[Approve advertisement]<c1| (click: ?c1)[(set: $ads to "allow")(go-to: "Approve advertisement")]
[Deny advertisement]<c2| (click: ?c2)[(set: $ads to "deny")(go-to: "Deny advertisement")](set: $adv to "mood")
These Mood-Advertisements should boost profit margins significantly...
However, your shareholders have pointed out that you can do even more.
If you can change moods and apply stimuli, you could induce addictive responses to simple placebos. You could create artificial drugs over which Neuralink would have a monopoly.
This could create unprecedented amounts of profit, rivaled only by the old East India Company's opium trade back in the 18th century. Of course, creating addictions raises major ethical concerns... But really, it's not that different from what you've already done.
[[approve induced hedonism]]
[[deny induced hedonism]]Affecting people's behavior for nothing more than profit does raise more than a few ethical questions, doesn't it?
But what about changing people's behavior to be more ethical? Your researchers have discovered how to interact with a person's conscience and moral reasoning. The benefit of this is a morally better society. It could decrease crime of all sorts, lead to more philanthropy, and just make people nicer in general.
The downsides, of course, are largely the same as for Emotion-Regulation: higher research costs and less public trust.
[[Approve Moral-Straightening]]
[[Deny Moral-Straightening]](set: $adv to "none")
Improving morals has been something philosophers have been trying to do for centuries, and you've finally done it.
The CIA wants to become your patron. They believe your work would be a vital asset in protecting the country from domestic terrorists. All they want is a backdoor, just like the one the government is trying to get into other devices like phones and computers. In return, they offer public funding.
The benefit is, of course, giving the government an interrogation method that is not only much less traumatic than the alternatives, but 100% reliable... you just download memories rather than take a suspect's word for it. This means the government could shut down any potential domestic terrorist before they became a threat, with a near-zero chance of convicting an innocent.
Of course, this is a massive breach of privacy. Not only that, but if you open a backdoor, you open up a potential security hole that's just waiting for a hacker to find. But surely national security is more important?
[[approve government intervention]]
[[deny government intervention]]
(set: $ads to "mood")(set: $data to "mood")
No. To change people's moral fiber is to change their decision making... to change who they are... That's too much. If people want to be better, they can choose to be so themselves. To force it upon them would be unethical.
Now all that's left before release is figuring out precisely how you want to brand Neuralink's iBMIs. You have the option of putting a steep markup on your invention, confirming it as a luxury and a status symbol.
The benefit of this is that it would make your shareholders very, very happy. Entrenching the iBMI as a luxury would additionally make people with money much more interested in the research, allowing you to make newer and more advanced versions, much like overpriced smartphones.
The downside, of course, is that it unfairly benefits people with money if they can become transhuman while the common man cannot... but maybe you can reduce the price after you release a newer model.
[[High prices]]
[[Low prices]]It's release day at Neuralink, and you've unleashed Pandora's box upon humanity. By giving the CIA access to a device that interacts directly with the brain, you have given them an intelligence agency's dream... total intelligence. Memory storage was part of the initial goal of Neuralink, and you've handed the collective stored memories of all of your new clients to the government.
No one knows this of course. It's a trojan horse into their deepest thoughts... Why would the CIA allow this information to get out? If any of the people that do know try and say anything... well, you all have agents watching your every move.
Bah, but what do any of the ethical and privacy concerns matter? You are filthy rich and not inside a jail cell. Additionally, you've advanced the bounds of human brain research far more than anyone else in human history. You may even win a Nobel Prize... and all it cost was other people's privacy. And besides, if they aren't doing anything bad, they don't need privacy anyway.
The Dystopian Ending.Neuralink has been shut down by the government. It's over.
You might have saved the public's freedom but lost your own in the process. You now have a life sentence in a federal prison, never to see a lab again.
Additionally, your work has been vilified by the press, setting back public opinion of your research by decades. If the goal was transhumanism, you have brought it farther from fruition, not closer.
You could have done so much, but you got greedy and then only found your spine when it was too late.
You are a failure, in every sense of the word. And you have the rest of your life to contemplate that.
The Bad Ending.The CIA has taken notice of your efforts and has come to make you an offer you can't refuse.
On one hand, you're about to create an artificial addiction: a second opioid epidemic. The CIA can and will shut you down, ruining all of Neuralink's research and development and landing you in prison.
On the other hand, the CIA is offering to become Neuralink's patron. All you need do is give them a little control over the iBMIs. A little emotion-bombing here, to take down potential domestic terrorists. A little memory-searching there, to identify them. The CIA encourages you to 'Do the right thing'.
[['Do the right thing']]
[[Accept your fate]](set: $adv to "mood")
No, this has gone far enough. Creating a drug epidemic would be incredibly illegal. Mood Advertisements push the line more than enough.
The CIA wants to become your patron. They believe your work would be a vital asset in protecting the country from domestic terrorists. All they want is a backdoor, just like the one the government is trying to get into other devices like phones and computers. In return, they offer public funding.
The benefit is, of course, giving the government an interrogation method that is not only much less traumatic than the alternatives, but 100% reliable... you just download memories rather than take a suspect's word for it. This means the government could shut down any potential domestic terrorist before they became a threat, with a near-zero chance of convicting an innocent.
Of course, this is a massive breach of privacy. Not only that, but if you open a backdoor, you open up a potential security hole that's just waiting for a hacker to find. But surely national security is more important?
[[approve government intervention]]
[[deny government intervention]]
Cheaper prices should translate into more sales.
Of course, there is another way of decreasing prices besides advertisements. Data is everything.
Facebook and Google, to name just two, both deal in data to target advertisements. Data is also useful for a great deal of other things, such as figuring out sales trends.
The benefit is that you can greatly drive down the cost of your iBMIs by selling data to other companies. This in turn makes your iBMIs much more accessible.
The downside is largely privacy concerns, but all of the major tech companies are doing this to some extent. Additionally, this wouldn't collect identifying information, just more general trends.
[Approve data collection]<c1| (click: ?c1)[(set: $data to "allow")(go-to: "approve data collection")]
[Deny data collection]<c2| (click: ?c2)[(set: $data to "deny")(go-to: "deny data collection")]Lower prices aren't worth the hassle our users will get from ads.
Of course, there is another way of decreasing prices besides advertisements. Data is everything.
Facebook and Google, to name just two, both deal in data to target advertisements. Data is also useful for a great deal of other things, such as figuring out sales trends.
The benefit is that you can greatly drive down the cost of your iBMIs by selling data to other companies. This in turn makes your iBMIs much more accessible.
The downside is largely privacy concerns, but all of the major tech companies are doing this to some extent. Additionally, this wouldn't collect identifying information, just more general trends.
[Approve data collection]<c1| (click: ?c1)[(set: $data to "allow")(go-to: "approve data collection")]
[Deny data collection]<c2| (click: ?c2)[(set: $data to "deny")(go-to: "deny data collection")]Lower prices at little to no impact on the users? That's a steal!
Now all that's left before release is figuring out precisely how you want to brand Neuralink's iBMIs. You have the option of putting a steep markup on your invention, confirming it as a luxury and a status symbol. Any previous cost reductions would turn into pure profit.
The benefit of this is that it would make your shareholders very, very happy. Entrenching the iBMI as a luxury would additionally make people with money much more interested in the research, allowing you to make newer and more advanced versions, much like overpriced smartphones.
The downside, of course, is that it unfairly benefits people with money if they can become transhuman while the common man cannot... but maybe you can reduce the price after you release a newer model.
[[High prices]]
[[Low prices]]Everyone might be getting into data, but Facebook did get in trouble with the law for it. It's safer not to, and it would help our optics.
Now all that's left before release is figuring out precisely how you want to brand Neuralink's iBMIs. You have the option of putting a steep markup on your invention, confirming it as a luxury and a status symbol. Any previous cost reductions would turn into pure profit.
The benefit of this is that it would make your shareholders very, very happy. Entrenching the iBMI as a luxury would additionally make people with money much more interested in the research, allowing you to make newer and more advanced versions, much like overpriced smartphones.
The downside, of course, is that it unfairly benefits people with money if they can become transhuman while the common man cannot... but maybe you can reduce the price after you release a newer model.
[[High prices]]
[[Low prices]]By pricing your product highly, you have ensured the future of your research. The fact that only the wealthy can afford an iBMI might even help you in the future, as people will want to do what the celebrities are doing... giving you free advertising by the time you reduce prices.
The ability to be more than the masses. That is what Neuralink has released. It's release day, and your iBMI has become quite popular with those that can afford it. Anyone with the money can buy a mental advantage over their competitors, and so it would be foolish not to get an iBMI.
(if: $ads is "allow")[The advertisements led to consumer backlash, of course. The decision to hike prices for more profit while trying to reduce them via ads is quite unpopular with the elite target demographic. However, nearly all of them can afford the fee to remove ads, leading to very few companies wanting to purchase advertisement rights.](if: $ads is "deny")[The call to not include ads was certainly beneficial to overall profits. Not including ads improves Neuralink's brand optics, and the majority of the people who can afford a Neuralink iBMI can afford a subscription to opt out of advertisements anyway.](if: $ads is "mood")[The mood-correction software is quite popular amongst those that need it. Ailments like bipolar disorder may become a thing of the past, and less prominent ones like depression may silently disappear.]
(if: $data is "allow")[Data collection proved... unpopular with a target demographic that can afford the lawyers to look over the terms and conditions. High-profile people with the money to keep their data from being sold tend to prefer their privacy.](if: $data is "deny")[Data collection might have proved detrimental had your clients read the terms and conditions. It was wise not to try and sell that data, and it helps build the public's trust in Neuralink.]
Additionally, your research has advanced human understanding of its most vital organ, the brain, farther than anyone else in living memory. You've been nominated for a Nobel Prize.
The Corporate Ending.If you price the product conservatively, then everyone can buy your iBMIs, and that's the real goal, isn't it? To improve the cognitive abilities of the human race as a whole, to make sure it stays competitive with AI.
It's launch day for your iBMI. While they aren't selling as rapidly as your shareholders would like, no new technology does. In time, however, you could augment the whole human species. As more people get iBMIs, distrust will fall and sales will grow; but in the meantime, you will have to rely on early adopters.
(if: $ads is "allow")[The advertisements led to consumer backlash, of course. People find it much harder to ignore advertisements streamed directly into their minds, but the increased profits from lower prices exceeded the backlash.](if: $ads is "deny")[It may have been wiser to allow advertisements, but your decision to deny them has certainly made Neuralink's clients happier... although there are fewer of them.](if: $ads is "mood")[The mood-correction software is quite popular amongst those that need it. Ailments like bipolar disorder may become a thing of the past, and less prominent ones like depression may silently disappear.]
(if: $data is "allow")[The data collection policy Neuralink has implemented proved quite successful. There is a bit of public distrust about what exactly you are doing with that data, but even Facebook hasn't sunk yet.](if: $data is "deny")[Had you chosen to implement data collection, it would most likely have gone largely undetected. Your target demographic can't afford to take Neuralink to court, and the decreased price would have led to more sales.]
Additionally, your research has advanced human understanding of its most vital organ, the brain, farther than anyone else in living memory. You've been nominated for a Nobel Prize.
The Transhumanist Ending.You've allowed the government access to the people's most private thoughts... so long as they have a warrant, anyway. Where this direction of iBMIs goes, however, is entirely out of your control.
You may have made some ethically sound choices, and some not-so-ethically-sound ones. You've advanced human understanding of its most vital organ, the brain, farther than anyone else in living memory. You've been nominated for a Nobel Prize.
On top of all this, you've made yourself a pretty penny. You can rest easy knowing you did your job, and did it well...
So long as you can silence your thoughts about the weakened security, anyway. And what could the CIA do, beyond what you've given them? Surely they wouldn't try and restart Project MKUltra with the new toy you've handed them.
The "Patriot" Ending.No. Absolutely not. The amount of power allowing someone else to directly access people's brains is unthinkable... It cannot be allowed for anyone to do so but Neuralink; the potential for abuse is too large, and you will not have that on your hands.
Launch day is soon, and once it's here (if: $adv is "mood")[you will have made both yourself and Neuralink a massive amount of money. You may have given people a few unwanted urges, but that will easily pay for whatever research you have planned next.](if: $adv is "none")[you will have created a morally better society, with emotional disorders a thing of the past. Some may complain about free will, but you've created the start of a utopia.]
Additionally, your research has advanced human understanding of its most vital organ, the brain, far beyond everyone else in memory. You've been nominated for a Nobel prize.
The Optimistic Ending.
<script>
// Resize the embedding iframe to fit the rendered passage (minimum 500px).
function EmbedTwineUpdateHeight() {
    var passage = document.getElementsByTagName("tw-passage")[0];
    if (passage === undefined) { // Harlowe renders <tw-passage>; SugarCube uses #passages
        passage = document.getElementById("passages");
    }
    var newHeight = passage.offsetHeight;
    if (newHeight < 500) { newHeight = 500; }
    window.parent.postMessage(["setHeight", newHeight], "*");
}
// Wait briefly so the passage has rendered before measuring.
setTimeout(EmbedTwineUpdateHeight, 50);
</script>