RLP's Hobbies Blog

Free Story Ideas

These ideas are completely free for any and all uses, although if you try to sue someone else for developing ideas you found here, I’ll try to fight that.

Anything in this document is free to use, with no licenses or royalties or anything. However, I would appreciate it if you let me know if you do something interesting with these ideas.

  • hyper-competent wizard solves problems, but she is a tiny old Chinese woman whose success comes from being underestimated. Maybe she fronts as someone’s sidekick or housekeeper.
  • I’d like to reverse the usual “last of his kind” trope: someone is the last of his kind, but then he goes out and fixes it. That is his quest: to actually solve the fucking problem instead of giving up the way everybody else seems to.
  • (not in my usual mode) someone is running a pawn shop with the goal of sort of being a real-life super hero: trying to actually solve the underlying problems in the lives of eir clients
  • what modifications to humans would be required to make http://en.wikipedia.org/wiki/Gun_Kata actually possible? What side effects would such modifications, and the ability to make them, have on the culture in question?
  • So in the original mythology Hercules (Herakles) is a very stupid, sick, murderous motherfucker who somehow ends up looking good to the Greeks in the end. What I want is a story that assumes there was a historical personage there, and tells the story of this super-strong, insane motherfucker … and the smart little friend of his who follows him around and spins all the fuckups. Because that’s the only way I can see how the Herakles legends ended up looking the way they do, and I totally wanna meet his sidekick. “Uhhh… Well, as I already told you, he’s a child of Zeus, and we know how jealous Hera gets. Yeah, that’s it, Hera made him do it! Yeah…”
  • SF with hyperspace, but instead of hyperspace being totally empty (basically every piece of media ever) or having occasional weird pockets of life (later Star Trek), it’s full of life. Like, way way fuller than our normal universe, such that every dip into hyperspace risks murdering the shit out of a bunch of people, like if your car suddenly appeared in Times Square on a random Tuesday. So how and when you can enter hyperspace becomes like this huge political agreement thing with the species that inhabit it, who by the way are confused as to why any of us want to stay in such a hostile universe in the first place.
  • You know those brain fungi that infect ants? https://en.wikipedia.org/wiki/Ophiocordyceps_unilateralis ? So what if those are actually not rare, it’s just that the fungus ends up in this bizarre symbiosis with the brain such that the higher the intelligence of the host, the higher the effective intelligence of the fungus, and the reason we haven’t noticed is that humans literally can’t see mammalian brain fungi? Take it further: the human form of the fungus is sentient, and reproduces only in dead humans. Ideally warm dead humans. So not only are they responsible for our current burial traditions, they are responsible for basically every over-reaching human insanity that has led to mega-death. It turns out that humans are quite content to stop when they’re basically comfortable with life; the urge to keep pushing until you end up with WW2 is entirely the fungus. Bonus points for setting up an anti-fungus of some kind that explains cremation; maybe a secret human society that knows about the fungus but has no idea what it actually is, just sees it as evil?
  • This isn’t really a story so much as a general book idea, but: “Shit You Don’t Know”, a book based more or less on http://lesswrong.com/lw/kj/no_one_knows_what_science_doesnt_know/. It would just be a sequence of things like: There is a standing pool of genetic variation with regard to your immune system, i.e. there are a variety of options for the exact make and model of your immune system, but the number of options is neither very small (like 2) nor very large (like millions) [reference]. One’s immune configuration along this axis is detectable by smell [reference], and people (women in particular) do, in fact, detect it unconsciously [reference] and tend to prefer partners who have dissimilar immune configurations [reference], presumably because in the ancestral environment any one infection was less likely to wipe out the entire family.
  • what if early religious texts actually did have scientific secrets in them?
  • All of humanity is simultaneously transported to the future by some amount, say 10 years. None of the infrastructure is, though. It would start with a lot of dying, obviously.
  • Xian Xia novel like 40 Millenniums Of Cultivation, but with a strong focus on mental/Admin cultivators, and in particular their abilities to influence and coerce. I’m thinking of this like a cross between the Bene Gesserit and Worm’s Simurgh. So, like, it’s illegal for a high enough level mental cultivator to speak to anyone of a lower level, ever, without explicit permission in advance. Because they can decode your mind so thoroughly that they can essentially get you to do whatever they want by saying apparently arbitrary, harmless things. There would be whole protocols around, like, bringing cannon fodder to meetings so that the mental cultivator across the table has someone to talk through. The cannon fodder’s job is to rephrase what the cultivator says in their own words, but as soon as they do so they are assumed to be compromised. That sort of thing.
  • Space Opera, sort of: it turns out that the reason for the Fermi paradox is that humans are amazingly stupid. Most civilizations get from horse-and-buggy to their ultimate end within 2-3 hundred years. The end could be grey goo or nukes or becoming a pure upload civilization or any number of other things, but the speed of progress and innovation ensures that almost all civilizations fail catastrophically, and the few that don’t see the distance between stars as absurdly vast and pointless to bother with; when your entire civilization is linked by networking whose latency almost never rises above the theoretical light-speed lag (up to and including driving a hole through the planet to go faster), sending probes to other stars starts to look really unpleasant. Especially if you think dozens of times faster than humans. So it turns out that human civilization is absurdly long-lived, and that it is far more likely than other civilizations to continue surviving, because of our glacial pace of innovation and (by others’ standards) truly ludicrous aversion to risk. So humans do end up spreading to the stars … and discover that not only are we not alone, but in fact the universe is a seething morass of civilizations, with hundreds of them having risen and fallen since human civilization started, in just the local 50-light-year sphere.
  • Mad Max, but after the self-driving car revolution, so sort of a cross between Mad Max and Adeptus Mechanicus: cars are basically treated like cargo cult gods and have to be appeased via various rituals, and the whole story revolves around the control of, I dunno, like a big 18-wheeler or something, and its social and religious implications.
  • It turns out that the answer to the Fermi paradox is that humanity is the only intelligent species that ever developed anything like kindness or cooperation as a large-scale cultural attribute, so literally every other species ends up nuking itself as soon as the technology becomes available. This leads to something like Starfleet, but with the reverse of the Prime Directive: officers are required to intervene in less advanced cultures the instant they are able to do so.
  • Various SF has “humans almost got destroyed because AI and now we have significant limits on what AIs/computers are allowed to do” as a trope. Here’s my question: could you flip that story entirely on its head and still make it appealing adventure fiction? “In the past we allowed governments/militaries/starships to be run by humans, but this was a demonstrably terrible idea. … Oh shit, we just discovered a star empire next door that lets humans run their military and war machines! Oh shit they’re terrifying barbarians we are in serious fucking trouble!!”. To do this and still end up with adventure fiction/space opera (which is what I want), you’d have to have some pretty dumb AI; no hard takeoff, no superintelligence, no hard nanotech, etc, etc. But if you basically wrote it as “we can’t make minds much smarter than humans, but we can make minds that are more conscientious/reliable/moral/etc than humans”, I think it could be pretty fun.