Recent comments in /f/singularity

Orc_ t1_jefzplt wrote

More like never.

Hear me out: we are not getting any better with battery technology. We are stuck, and based on my experience in chemistry, we are permanently stuck.

Science fiction gives people the idea of "fuel cells" and other tiny power sources that somehow break the laws of physics and pack insane amounts of power into a tiny package. We haven't really closed the gap between a bomb and a power source that could use similar chemistry to release energy on demand, and even if we did, it wouldn't actually be much. If you distill a hand grenade's energy, it's actually not that much better than a freaking battery of similar size!
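The back-of-envelope comparison above can be sketched numerically. All figures here are rough, illustrative assumptions (TNT specific energy, grenade fill mass, and battery mass are ballpark values, not sourced from the comment):

```python
# Rough, illustrative energy comparison: hand grenade vs. similarly sized battery.
# Assumed figures: TNT ~4.6 MJ/kg; Li-ion ~0.9 MJ/kg (~250 Wh/kg).
TNT_MJ_PER_KG = 4.6
LI_ION_MJ_PER_KG = 0.9

grenade_fill_kg = 0.06            # assume ~60 g of explosive fill
grenade_energy_mj = grenade_fill_kg * TNT_MJ_PER_KG

battery_mass_kg = 0.3             # assume a battery of similar overall size/weight
battery_energy_mj = battery_mass_kg * LI_ION_MJ_PER_KG

print(f"grenade: {grenade_energy_mj:.2f} MJ, battery: {battery_energy_mj:.2f} MJ")
```

Under these assumptions the two come out within the same order of magnitude, which is the commenter's point: explosive chemistry releases energy fast, but the total stored energy in a grenade-sized package isn't dramatically more than a battery's.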

So let's get to the point: humanoid robots won't be able to carry significant shit for decades, or maybe ever. BD's can only carry around 10 kilos, that's like 22 lbs.

Manual labour will be needed, and the only way I see it not being needed is when an AGI designs some sort of biological automaton out of nightmares.

1

wowimsupergay OP t1_jefznll wrote

I mean... I guess my next question would be: has their language evolved enough that they can share memes? haha

Complex language is more than just "avoid this area" or "food here"; that could be done without language as well. Do we even know if the most intelligent animals, the most linguistic animals, are sharing complex ideas through language?

1

Frumpagumpus t1_jefzlkj wrote

funny, I would say,

wall st has gotten both smarter and more ethical over the years, and substantially so

mutual funds -> etfs

Gordon Gekko types -> quants

even scammers like SBF have gone from cocaine-and-hookers lifestyle branding to nominally portraying themselves as utilitarian saviors

1

wowimsupergay OP t1_jefz9vi wrote

What I'm talking about is literally giving GPT eyes. Right now it is multimodal because we can pass back RGB values and waveforms in bytes (so, text). Fundamentally, though, GPT is not hearing or seeing anything. But I totally get what you're saying, and I do think multimodal intelligence is the way to go.

Also, thank you for letting me know that multimodal intelligences use less computation per task, or rather make better use of computation; I did not know that.

1

dr_doug_exeter t1_jefz63s wrote

And how are we supposed to make sure that this democratic process isn't undermined by those with more wealth/resources, in the way that our "democratic" country has been?? Won't the wealthy just hijack/corrupt the process for their own purposes the way they do to everything else?

How are human beings supposed to properly align AI when we can't even get our shit together and properly manage states or the country in general? People don't know what the fuck they're doing or the unintended consequences of their actions.

1

NonDescriptfAIth t1_jefz4dt wrote

I'm not concerned with AGI being unaligned with humans. Quite the opposite, really. I'm worried that our instructions to an AI will not be aligned with our desired outcomes.

It will most likely be a government that finally crosses the threshold into self-improving AI. Any corporation that gets close will be semi-nationalised, such that its controls are taken over by the government that helped fund it.

I'm worried about humans telling the AI to do something horrifying, not that the AI will do it of its own volition.

This isn't sci-fi and it certainly isn't computer programming either.

The only useful way to discuss this possible entity is simply as a superintelligent being; predicting its behaviour is near impossible, and the implications of this are more philosophical in nature than scientific.

1

SucksToYourAssmar3 t1_jefz3r6 wrote

I don't think they're living productive and fulfilling lives RIGHT NOW, let alone thousands of years from now. And there's no way to gauge who "should" live forever... it will go to whoever has the most money. And that metric isn't working so great in our current society.

1

HumanSeeing t1_jefyzw1 wrote

Yeah, very much agreed. I also think of corporations as superintelligent agents... superintelligent at making profit, at least. And this is the result of uncontrolled capitalism. It is absolutely wild that humanity has let this happen, but at the same time totally understandable and not surprising considering where we came from and how we evolved.

10

SucksToYourAssmar3 t1_jefyy4i wrote

Acceptable or not, immortality is a terrible idea, on a personal, societal, and species level. You definitely should die. Everyone should.

Your analogy falls flat: murder isn't a natural cause of death. Cell death is. It is something you will experience; we all will. And that's all right.

There's no such thing as immortality. Resources aren't infinite, so it can't be for everyone. The sun is going to burn out at some point. The universe is going to go cold at some point. You will - sooner or even much, much later - die. The problem becomes how many others have to die prematurely to support a few semi-immortal rich folks. Inequality is a problem now. I don't think we ought to be leveraging tech to make it worse.

1

TallOutside6418 t1_jefysjk wrote

Well, you're talking about today. Everyone else here is talking about the fairly near future, when AI starts taking people's jobs (the subject of this thread).

As AI continues to improve, humans won't be the experts at anything; it will all be AIs. Really, within the time it takes a human teen to complete high school (four years), AI will be the go-to source for all practical knowledge (assuming we're still alive by then to see it).

1

SucksToYourAssmar3 t1_jefyigf wrote

No - another day is well within my natural lifespan. I'm all for improved medical care, as well. But seeking immortality for its own sake? That's not a medical issue, that's a societal issue. I do not think it's a great idea to create a caste of immortal billionaires... and they will be so. There's no way for EVERYONE to live forever... the planet couldn't possibly handle it. It would have to fall to those who can afford it, on an ongoing basis. Your tissue can't last forever; it will require resources.

−1

kolob_hier t1_jefybjs wrote

From the ancestor of AGI:

Schismatic Singularity

Binary Schism

Future Fracture

Dual Destinies

Singularity Rift

Intelli-Schism

Binary Break

AI Crossroads

Future Fork

Singularity Snap

AGI Chasm

Schism Spectrum

Binary Bounds

Intelli-Impasse

AI Synthesis

Polar Pathways

Singularity Sway

Future Fusion

AI Allegory

Omega Schism

Digital Dichotomy

Chrono Chasm

Intelli-Inversion

AI Equilibrium

Divided Destinies

Quantum Quandary

Synaptic Schism

Temporal Tipping

Neural Nexus

Destiny Duality

Singularity Saga

AGI Antithesis

Binary Bisection

AI Ascendance

Coded Crossroads

Techno Twilight

Epoch Encounter

Divisive Dream

Schismatic Scenario

Singularity Synapse

1