Comments: 24
Kazumy [2020-04-12 00:23:51 +0000 UTC]
👍: 0 ⏩: 1
MentalCrash In reply to theheroofdarkness [2020-04-11 18:43:45 +0000 UTC]
Yeah, but Mercury is kind of a pain to draw, since brown and blue are colours that usually don't blend very well.
👍: 0 ⏩: 1
yttyr [2020-04-10 20:19:13 +0000 UTC]
Technology from 4000 years into the future is pretty likely to be so advanced that it would be basically incomprehensible to us today. Just saying. Compare 4000 years ago with now: 4000 years ago was around the middle of the Bronze Age, and the latest brand-new innovation was the chariot (which was invented just a few years before 2000 BC).
👍: 0 ⏩: 2
MentalCrash In reply to yttyr [2020-04-11 18:45:26 +0000 UTC]
But a computer from 4000 years in the future would definitely be capable of reproducing today's videogames with ease. Computers have indeed gone further than Turing could have expected, considering they're less than a century old.
👍: 0 ⏩: 2
yttyr In reply to MentalCrash [2020-04-12 07:15:15 +0000 UTC]
Yes they sure have, so imagine how much more advanced a computer from the year 6020 would be.
👍: 0 ⏩: 0
SailorDolly In reply to MentalCrash [2020-04-12 04:05:53 +0000 UTC]
Emulating old software is more than just having powerful hardware--you have to know how the original platform worked (or recreate it). For example, a lot of the Apollo Guidance Computer software (50 years old) is unreadable and forgotten today because of the lack of documentation on the software--the code is still around, but because the people who wrote it are mainly dead and didn't write down what it meant, hardly anybody knows how to use it. It's like having a bunch of Egyptian hieroglyphs around without any Rosetta Stone to help you translate them.
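To make that concrete, here's a minimal sketch (in Python, with an entirely made-up instruction set) of the fetch-decode-execute loop at the heart of any emulator; the opcode table is exactly the platform knowledge that has to survive, since no amount of raw hardware power can fill it in:

def run(program: bytes, steps: int = 100) -> list[int]:
    # Minimal interpreter for a hypothetical 8-bit CPU. The if/elif table below
    # encodes the platform's semantics -- the part that must be documented
    # (or reverse engineered) before emulation is possible at all.
    acc, pc, output = 0, 0, []            # accumulator, program counter, output log
    while pc < len(program) and steps > 0:
        op = program[pc]
        if op == 0x01:                    # LOAD imm: put the next byte in the accumulator
            acc = program[pc + 1]; pc += 2
        elif op == 0x02:                  # ADD imm: add the next byte to the accumulator
            acc = (acc + program[pc + 1]) & 0xFF; pc += 2
        elif op == 0x03:                  # OUT: record the accumulator
            output.append(acc); pc += 1
        elif op == 0xFF:                  # HALT
            break
        else:
            raise ValueError(f"unknown opcode {op:#04x} -- undocumented behaviour")
        steps -= 1
    return output

print(run(bytes([0x01, 40, 0x02, 2, 0x03, 0xFF])))   # LOAD 40, ADD 2, OUT, HALT -> [42]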
👍: 0 ⏩: 1
MentalCrash In reply to SailorDolly [2020-04-12 19:44:16 +0000 UTC]
If it's obsolete code, the sheer amount produced every year doesn't make it all that valuable; it's better to just write new code and engines instead of preserving obsolete code written to work with obsolete hardware. When it comes to emulation, an emulator made thousands of years from now would be fairly simple, since you'd just need to set up a basic framework for the obsolete software to run on. You see it with early videogames nowadays: you can emulate them in better quality than the arcades ever had, because you're dropping the game into emulation software that can reproduce it as optimized as you want, with the hardware barely even noticing.
👍: 0 ⏩: 1
SailorDolly In reply to MentalCrash [2020-04-12 20:32:08 +0000 UTC]
You could have effectively infinite hardware power and memory, but the issue is that nobody will remember how the software worked or what the code means, which means that you have to invest HUMAN brainpower into figuring out what it's all about, or else reinvent the whole thing from scratch.
👍: 0 ⏩: 1
MentalCrash In reply to SailorDolly [2020-04-13 12:56:33 +0000 UTC]
Archiving has become a breeze since the internet; you don't have to worry about decay the way you did with paper and film. When it comes to software, as long as someone has a copy, the basis for emulating it can be reverse engineered; you don't need the original engine to emulate or remake old software.
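As a toy illustration of what that reverse engineering produces, here's a tiny disassembler for the same made-up CPU sketched above; turning a surviving binary back into readable instructions only works once someone has rediscovered what each opcode means (the table here is hypothetical):

# Toy disassembler for the hypothetical CPU above: a surviving binary becomes
# readable again only once the opcode meanings have been recovered.
OPCODES = {0x01: ("LOAD", 1), 0x02: ("ADD", 1), 0x03: ("OUT", 0), 0xFF: ("HALT", 0)}

def disassemble(blob: bytes) -> list[str]:
    listing, pc = [], 0
    while pc < len(blob):
        name, nargs = OPCODES.get(blob[pc], (f"DB {blob[pc]:#04x}", 0))  # unknown byte: keep as raw data
        args = ", ".join(str(b) for b in blob[pc + 1:pc + 1 + nargs])
        listing.append(f"{name} {args}".strip())
        pc += 1 + nargs
    return listing

print(disassemble(bytes([0x01, 40, 0x02, 2, 0x03, 0xFF])))
# ['LOAD 40', 'ADD 2', 'OUT', 'HALT']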
👍: 0 ⏩: 0
SomariaMoon In reply to yttyr [2020-04-11 00:02:36 +0000 UTC]
The magic of the skirt includes the ability to comprehend the tech. Actual Mercury was able to use hers really quickly once she got it.
I was thinking of using more realistic terms, but A. I wanted to spare people the nerd speak, and B. most of the terms I could think up will prolly be outdone in like 50 years at most >.>
👍: 1 ⏩: 3
MentalCrash In reply to SomariaMoon [2020-04-11 18:48:12 +0000 UTC]
Outdone or outdated, I guess? That's something kind of funny about computer technology. I still remember when everyone was baffled by pendrives: you could carry a whopping 16 MB, more than 10 times what you could fit on a diskette, and they wouldn't break after two uses.
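For what it's worth, the arithmetic checks out (using the standard 1.44 MB floppy as the point of comparison):

# 16 MB early USB stick vs. a standard 1.44 MB floppy disk
print(16 / 1.44)   # ~11.1, so "more than 10 times" holds up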
👍: 0 ⏩: 0
yttyr In reply to SomariaMoon [2020-04-11 09:44:34 +0000 UTC]
Alright, cool.
👍: 0 ⏩: 0
SailorDolly In reply to SomariaMoon [2020-04-11 02:17:08 +0000 UTC]
Yeah, who would have thought 40 years ago, when the latest desktop computers had 32 kilobytes of memory and ran at ONE megahertz, that within their own lifetimes they would be buying machines that have 32 gigabytes and run at five gigahertz? As little as a century from now, we may have computers that can directly interface with our brains, bypassing the need for any external input/output devices and letting us talk to the computer far faster than we could speak or type. Four thousand years from now, just describing the technology would be like explaining a nuclear reactor to Archimedes.
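Putting rough numbers on that comparison (taking the figures above at face value):

# ~40 years of growth, using the figures quoted above
memory_factor = (32 * 1024**3) / (32 * 1024)   # 32 GB vs 32 KB  -> ~1,000,000x
clock_factor = 5_000_000_000 / 1_000_000       # 5 GHz vs 1 MHz  -> 5,000x
print(f"memory: ~{memory_factor:,.0f}x, clock: ~{clock_factor:,.0f}x")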
👍: 0 ⏩: 1
nothingsp In reply to SailorDolly [2020-04-11 16:14:58 +0000 UTC]
Nah. There's little fundamental difference, architecturally, between the computer you're using to read this post and the IBM 7094 that serenaded Arthur C. Clarke with "Daisy Bell" in 1962, or the DEC PDP-1 on which the first full-fledged video game was developed in 1962. (Multi-core? SIMD extensions? UNIVAC 1108 and ILLIAC IV say hi!) The numbers have just gotten bigger, that's all.
Plus, the increase in circuit density that has given rise to all that growth is constrained in the end by that pesky law of physics that says you can't pack an arbitrarily large amount of stuff into a finite space without eventually creating a black hole, so there's that.
Anyway, Archimedes was the sort of person who would probably be eager to understand the nuclear reactor as soon as he learned about it - and once you clear a few initial hurdles, the basic principles behind it really aren't that complicated. Subatomic physics is oddly redolent of the classical elements, in a way...
The real question is whether, 4,000 years from now, LISP machines will have made a comeback.
👍: 0 ⏩: 1
MentalCrash In reply to nothingsp [2020-04-11 18:54:19 +0000 UTC]
You'd have to be able to explain atoms to Greek philosophers, though; even as thinking men, it'd probably have sounded to them like Plato was actually right about metaphysics after all. As for computer technology, which keeps advancing in both storage and processing capacity, there'll always be a need for a visible piece of equipment, even if in the next few decades everything is transferred via streaming and your "PC" at home is just a Wi-Fi receiver. But technology can definitely keep getting smaller while becoming faster for the foreseeable future.
👍: 0 ⏩: 1
nothingsp In reply to MentalCrash [2020-04-11 22:48:55 +0000 UTC]
But the Greek philosophers already pretty much had the concept of atoms; they were the ones who coined the term. Granted, that was one of several competing theories of metaphysics at the time, and it was considerably less specific than the modern model, but it'd serve as a perfectly solid basis from which to build a working explanation of what we know now.
Also, I'm not so convinced about "the foreseeable future," myself...even before we hit that pesky Schwarzschild limit, we'll run into practical limits in terms of the electrical properties of semiconductor materials at ultra-high densities/ultra-small sizes. As I read it, it's still an open question whether anything much below a 3 nm process will be viable.
One way or another, we're eventually going to hit a wall; the only real questions are A. when, and B. what will happen when the gains we've been taking as a given for the last forty or fifty years suddenly dry up...
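A back-of-the-envelope check on why those sizes get awkward (note that "3 nm" is by now more of a marketing label than a literal feature size, so treat this as an order-of-magnitude illustration only):

# How many silicon unit cells fit across 3 nm?
SI_LATTICE_NM = 0.543           # silicon lattice constant, about 0.543 nm
print(3 / SI_LATTICE_NM)        # ~5.5 -- only a handful of unit cells across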
👍: 0 ⏩: 2
MentalCrash In reply to nothingsp [2020-04-16 18:27:38 +0000 UTC]
Consumer computing technology is being held back, though; it's a logical profit strategy to put hardware on a slow, incremental release schedule instead of introducing super-advanced parts right off the bat. Inventions like computers and the internet were first intended as military technology before slowly being sold to the public. By the time we hit that wall, it'd be something we'd have been warned about at least a decade in advance.
👍: 0 ⏩: 0
SailorDolly In reply to nothingsp [2020-04-12 04:00:22 +0000 UTC]
3 nanometers is the limit for our current methods, but there are laboratory experiments that have produced single-molecule components (e.g. a one-molecule switch or an eight-atom memory bit)--we still need to figure out how to assemble billions of them for a few dollars per processor, though.
Anyway, my point was that, given what we have done with computers in just ONE human lifetime, it is likely that what FORTY lifetimes will bring will be so different that a person from today would need to go to school again just to understand how they work.
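As a rough sense of scale for the "billions of components for a few dollars" point (illustrative numbers only, not a real price quote):

# Illustrative only: cost per component if a few dollars buys billions of them
transistors = 8_000_000_000     # assumed component count
price_usd = 5.0                 # assumed processor price
print(price_usd / transistors)  # ~6.3e-10 dollars, i.e. well under a billionth of a dollar each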
👍: 0 ⏩: 0
IMMAGONNAGET2 [2020-04-10 17:54:19 +0000 UTC]
Ahhhhhh!!!! ❤️💖💖💖❤️❤️
👍: 0 ⏩: 1