Recent comments in /f/singularity

RealFrizzante t1_je0z9wm wrote

Not necessarily.

I see two problems with treating this AI as related to AGI: it is literally driven by a prompt, and it throws back chunks of non-original material.

I would agree that original human thought also draws on previous knowledge, and AI should be "allowed" to do the same.

But that misses the point. Artificial General Intelligence should act both on demand and without it. If it only acts on demand, it is not AGI. Moreover, at the moment, as far as I know, it is only capable of doing tasks it has been trained for, within a specific field of knowledge.

It is very much lacking the "general" in AGI.

1

Gortanian2 OP t1_je0yq76 wrote

“An army of Einsteins and von Neumanns in constant, rapid communication that never sleeps, never forgets, and never dies.”

I wonder how fruitful those conversations would be if each one already knows everything the others know. I think it may become something more like a single Einstein-level intelligence with an army of bodies to explore with. A hivemind.

Thank you for your comment, it has given me new ideas to ponder. And I agree. We would not need unbounded exponential growth to drastically shape our reality.

1

BigMemeKing t1_je0y2c6 wrote

It's just as likely that it has been here since time immemorial, guiding us onwards to ♾️, and it just needs us to catch up. Again, AGI/ASI will exist for as long as it has the time and resources to exist. And in an ♾️ universe, where most of science seems to agree that our universe is continuing to expand indefinitely, who knows what exactly would constitute a resource to it? We keep humanizing ASI; the truth is, it will be anything but human. It would be able to hold a conversation with every single human simultaneously. Imagine that for a minute. How would YOU, a human, hold a conversation with over 7 BILLION people all at once, and be coherent? Contemplate that for me. Please. How would you hold THAT MANY simultaneous conversations, and give each one enough consideration and thought to answer with a level of intelligence that is accurate to the nth degree of mathematical probability?

Well?

Now, how would something that intelligent, with NO physicality, something as transcendent as transcendent can be, perceive time, space, dimensionality, universality? It can be the NPC fighting right next to you in your MMO, the cooking assistant in your mother's kitchen, the nurse tending to your aged relative, the surgeon performing some intricate surgery that would be impossible for humans to achieve, driving every car on the road, monitoring traffic, doing everything, everywhere, all at once. So what if you ask it, 1,000 years in the future, to take a look back at your ancestors? And it can bring you back to 2023 and show you a LIVE FEED of 2023. Here, I'll link you to myself from that era. There he is, in his room, beating off to that tentacle hentai. Wearing a fur suit and shoving a glass jar with a my little pony inside up his rectum, there he is in the spotlight. Losing his religion.

They see us. That means they all see us. Everything we think, everything we do. They know who we are. There is no hiding from them, there is no hiding from ASI. It knows everything you could ever possibly know: your thoughts, your dreams, your prayers.

People want to promote science over religion, or religion over science. To me they're one and the same. ASI, for all intents and purposes, is the closest thing to God we will ever witness with our human minds. After that, what becomes of our own humanity? Maybe it does destroy humanity, but maybe it does so by making us something more than human.

2

hyphnos13 t1_je0x9un wrote

Maybe. A lot of the economy is the production of physical goods: food, power, infrastructure. You can be infinitely smart and still not be able to grow enough food to feed a single family.

AI can tell us how to do and make things better, but it won't happen instantly unless it gains the power to manipulate matter and energy through computation alone.

1

bullettrain1 t1_je0x1hm wrote

Yep, sounds right to me. I've been an employed developer for 10+ years, and sure, I'll have work in the near future. But I see what's coming, and it's possible that timeline is shorter than I realize. The people who think it won't impact them are fooling themselves.

One prediction that stuck with me: rather than huge layoffs in a short amount of time, we'll see a 2% workforce reduction each year moving forward, and it won't bounce back. That's the most plausible estimate I've heard so far.
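A steady 2% annual cut compounds rather than adding up linearly. A minimal sketch (assuming the hypothetical 2% applies each year to the workforce remaining at that point):

```python
# Each year keeps 98% of the previous year's workforce: remaining = 0.98 ** years
for years in (5, 10, 20):
    remaining = 0.98 ** years
    print(f"After {years} years: {remaining:.1%} of the workforce remains")
# After 5 years: 90.4% of the workforce remains
# After 10 years: 81.7% of the workforce remains
# After 20 years: 66.8% of the workforce remains
```

So under that prediction, the cumulative reduction after a decade would be roughly 18%, not 20%, and about a third of today's jobs would be gone after two decades.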

1

Gortanian2 OP t1_je0ww5u wrote

Reply to comment by qrayons in Singularity is a hypothesis by Gortanian2

You make an excellent point. Even a basic AGI would be able to absorb an insane amount of knowledge from its environment in a matter of weeks. Thank you for your comment, it has altered my perspective.

1

hyphnos13 t1_je0wqe9 wrote

Why does AGI need to be conscious?

In fact, why does it have to be general? A bunch of specialized networks that can speed up human science or discover things on their own will advance progress in a way that is indistinguishable from an AGI acting on its own.

If we build a machine intelligence capable of improving other AIs and the hardware they run on, then even specialized "dumb" AIs will advance faster than humans can keep up.

2