
in reply to Wayne Radinsky

“I believe that the conventional idea of ‘writing a program’ is headed for extinction, and indeed, for all but very specialized applications, most software, as we know it, will be replaced by AI systems that are trained rather than programmed.”

Who is going to create those AI systems? Programmer Analysts?

in reply to Wayne Radinsky

I simply do not see the mechanical aspects of this. Major parts of system development depend upon the design phase. It's conceivable that the design phase will move more directly into testing -- but that does not eliminate the importance of design.
in reply to Wayne Radinsky

It's tempting to imagine replacing coders, but without any proficiency in maintenance and upkeep, how will we prevent information decay? We could wake up one day with failing programs that no AI can fix, because the data sets are flawed and no one can repair them. AI should uplift us mentally, meaning we understand more thanks to it, not less.
in reply to Wayne Radinsky

He's a blithering nincompoop, apparently.

A hand-coded program will always be more efficient and effective than an entire goddamn AI.

What the hell is he thinking?
Does he know anything about any of this?

in reply to Wayne Radinsky

"For example as people become smarter with the use of computers" - interesting, I have been thinking lately that people are getting dumber and dumber using computers... As a consequence of the aforementioned ladder of abstraction (and simply human evolution leading to the reversed Flynn effect) and the development of the UI on computers (including smartphones) in general.
in reply to Wayne Radinsky

People at the very least become deskilled by working with this stuff.
in reply to Wayne Radinsky

Yes, that's one of the main reasons for the reversed Flynn effect. They cite attention deficits and harmful chemicals as well.
in reply to Wayne Radinsky

“For example as people become smarter with the use of computers” - interesting, I have been thinking lately that people are getting dumber and dumber using computers


Yeah, I think I phrased that wrong. I wasn't referring to the genetics of brains making people smarter, but rather that the combination of a brain with the aid of a computer makes a person smarter. But I do agree that growing dependence on computers could possibly result in reduced brainpower, though I kinda doubt it. Who knows.

As interesting as the Flynn effect studies are, I'm not sure you can depend too much on their results. Pretty small sample sizes over a relatively short period of time:

Flynn effect and its reversal are both environmentally caused
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6042097/

in reply to Wayne Radinsky

One thing I would caution about the role of AI and computers. We tend to think of them competing with humans and their brains. But that isn't necessarily the main change that is happening. Rather, what may be happening is that the context is changing. Computers and AI might bring new and different ways of doing things. So, it's the new context that humans will deal with that is at issue.
in reply to Wayne Radinsky

AI output is derivative.
AI can't do anything a human can't do better, faster, for less.
in reply to Wayne Radinsky

One thing it can do is image recognition - which is going to be very useful for blind people. Phones will be able to run the model and describe whatever it's pointed at without needing a network connection.
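
For a sense of what that looks like in practice, here is a minimal sketch (my own illustration, assuming the torchvision library and one of its small pretrained models, none of which is referenced in the thread) of classifying a photo entirely on-device, with no network call at inference time:

```python
import torch
from PIL import Image
from torchvision import models

# A small pretrained model of the kind suited to phones and edge devices.
# The weights are fetched once; after that, inference needs no network.
weights = models.MobileNet_V3_Small_Weights.DEFAULT
model = models.mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()

# "photo.jpg" is a placeholder for whatever the camera just captured.
image = Image.open("photo.jpg")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    probs = model(batch).squeeze(0).softmax(dim=0)

best = probs.argmax().item()
print(weights.meta["categories"][best], f"({probs[best].item():.0%})")
```

A real assistive app would add text-to-speech on top, but the recognition step itself is roughly this shape.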
in reply to Wayne Radinsky

Yes of course a human aide can also do that, but having the freedom to manage without one will be an important option.
in reply to Wayne Radinsky

A human can do that better.

When people say "AI is making this possible" they really mean "This is stuff so worthless to us that we won't hire a person to do it".

That is literally all it means.

And when coding jobs are axed because of AI, it is still the same message: "This work is so worthless to us that, now that we can get not-a-person to do it, y'all are out of here".

They don't care about the quality. They foolishly believe the AI will improve, despite eating its own diarrhetic shit from here on out.

in reply to Wayne Radinsky

the combination of a brain with the aid of a computer makes a person smarter

I'm not sure this is a notion that is generally applicable -- like in many things, real-life scenarios are complex. "SQL" certainly made people vastly more productive, and the use of SQL instilled a good practical understanding of set-theory logic. But once you move up the ladder of abstraction from this already very abstract database interface, you are bound to lose a great deal of understanding of what it is you are doing. That's not smart.

in reply to Wayne Radinsky

Tom, no, that's not quite right. SQL is a good example. Before relational database theory, people were guessing how to organize database schemas. With a theory of "normalization", technical people began to organize data differently, systematically around the idea of "normal forms". Similarly, with the advent of object orientation, tech people began to organize procedures around the idea of message passing. Functional programming taught people how to control and manage "side effects". These techniques made people smarter about representing data and procedural organization, and made a huge difference in simplifying programs. And btw, not to be too pedantic about it, relational theory is based on relational algebra, but I'd have to dig through the cobwebs of my mind to say how that is more than set-theory logic.
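
To make the "side effects" point concrete, here is a minimal sketch (my own illustration, not anything from the thread) contrasting a function that silently mutates shared state with a pure one whose effects stay visible to the caller:

```python
# Impure: writes to state outside itself, so its behavior depends on,
# and quietly changes, the surrounding program.
totals = []

def record_total_impure(price, tax_rate):
    totals.append(price * (1 + tax_rate))  # hidden write to shared state

# Pure: same inputs always give the same output, nothing else is touched.
def total_pure(price, tax_rate):
    return price * (1 + tax_rate)

# The caller now decides what happens to the result, so the "effect"
# (building the list) is explicit at the call site.
totals = [total_pure(p, 0.25) for p in (10.0, 25.0)]
print(totals)  # [12.5, 31.25]
```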
in reply to Wayne Radinsky

Tom, OK, you meant set operations such as join, union, intersection, etc. So the SQL abstractions are: selected item, from sources, set operations on item sources, selection condition. Thus, rather than writing procedural code to query the database, you just declare the desired values of these four parameters. This works because the underlying standard query-engine code only has to be written once for everyone. So, the relational database idea is a good example of computer-inspired abstraction. (But this also required another abstraction, functional dependency, which is achieved by normalizing the data.)
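
To illustrate that declarative style, here is a minimal sketch (my own illustration, using Python's built-in sqlite3 module and a made-up two-table schema): the query states only the four parameters -- selected items, sources, the set operation joining them, and the selection condition -- and the query engine figures out the procedure:

```python
import sqlite3

# Hypothetical normalized schema: customer data lives in one table,
# orders reference it by id instead of repeating it.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Tom');
    INSERT INTO orders VALUES (1, 1, 50.0), (2, 2, 9.0), (3, 1, 120.0);
""")

# Declarative: what to select, from which sources, joined how, under
# what condition. No loops, no index handling, no access paths.
rows = db.execute("""
    SELECT customers.name, orders.amount
    FROM customers JOIN orders ON orders.customer_id = customers.id
    WHERE orders.amount > 20
    ORDER BY orders.amount
""").fetchall()
print(rows)  # [('Ada', 50.0), ('Ada', 120.0)]
```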
in reply to Wayne Radinsky

So, the relational database idea is a good example of computer-inspired abstraction.

This was my initial point. You are missing what followed. If we move upward from this level of abstraction, we lose all of this: the set operations such as join, union, intersection, etc., and the SQL abstractions of selected item, sources, set operations on item sources, and selection condition.

These are the basic operations, and things can get significantly more complicated from there. Who is going to be able to work out what they need to do to get what they want when they have no basic understanding? They won't have the mental machinery in place, nor the symbols to employ.

in reply to Wayne Radinsky

... I guess I am positing that there are "ideal" levels of abstraction to work within for various kinds of problems. Once we move away from that level, we lose a certain kind of understanding - and most likely, some portion of our ability to solve the problem at hand.
in reply to Wayne Radinsky

But we can't afford to keep it powered right now, and our budget for energy use keeps shrinking.

If it is smothered in the crib, none of that will happen.

in reply to Wayne Radinsky

Which tech is going to win the battle for computing resources (including energy): (a) so-called AI, or (b) cryptocurrencies?
in reply to Wayne Radinsky

AI will design software, write software, and do all the maintenance and upkeep. But it's foolish to try to predict when. AI seems to have been racing ahead in the last few years.

AI is like an invasive weed in a plowed field. The "crop" is at a disadvantage. We have got to find our area of superiority, or we will get overgrown.

in reply to Wayne Radinsky

While “musician” is totally in the crosshairs of AI, as we see, that applies only to musicians who make recorded music – going “live” may be a way to escape the automation. No robots with the manual dexterity to play physical guitars, violins, etc., appear to be on the horizon. Maybe they can play drums?

I'm still not sure of this. The machines can "make" music, but can they make it meaningful to humans... you know, something like "Pinball Wizard".

in reply to Wayne Radinsky

The machine would somehow have to "know" what is meaningful and what is not.
in reply to Wayne Radinsky

Some people sneer at AI gobbling up resources. Ha, look at energy consumption around the world.

The human brain uses about 25 watts of electricity, I'm told, and that's because it's very miniaturized. And there are a few billion people times 25 watts. So, people gobble up quite a bit of energy too. Won't be long before computers start miniaturizing too.
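
As a back-of-envelope check on that "billions of people times 25 watts" figure (a rough sketch; the round population number is my assumption, the 25 W is the figure quoted above):

```python
# Rough aggregate power draw of all human brains, using the thread's figure.
people = 8e9          # assumed: roughly 8 billion people
watts_per_brain = 25  # the ~25 W figure mentioned above

total_watts = people * watts_per_brain
print(f"{total_watts:.0e} W, i.e. about {total_watts / 1e9:.0f} GW")
# -> 2e+11 W, i.e. about 200 GW
```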

All this worry and sneering about computers is probably a bit misplaced. Like Wayne says, it's pretty hard to predict the future. I'm reminded of the story about the farmer driving his horse-drawn wagon past a newfangled Model A Ford stranded by the side of the road with a flat tire, and the farmer laughs as he passes and says, "Get a horse."

in reply to Wayne Radinsky

The human brain uses about 25 watts of electricity, I'm told, and that's because it's very miniaturized.

Yet humans are extremely wasteful of "brainpower" or potential brainpower. Look what a crow can do, using about two orders of magnitude less.

in reply to Wayne Radinsky

The brain is more or less optimized for adaptability to an environment, to survive long enough to reproduce. This organic process has created vast diversity over the eons. Computers just happen to be added to the mix at this point in the ongoing creation of diversity.
in reply to Wayne Radinsky

Yes, but will their "evolution" be more or less successful?
in reply to Wayne Radinsky

The brain uses exactly 0 W of electricity.

If you fail to understand that hard fact, you're going to make bad decisions, like this fella:

https://newrepublic.com/article/180487/balaji-srinivasan-network-state-plutocrat

in reply to Wayne Radinsky

It is always overlooked just how utterly stupid tech execs are.
They are worse than pond scum.
in reply to Wayne Radinsky

Andreas, that seems to be important to you, but it isn't clear what your point is.
in reply to Wayne Radinsky

Like our muscles, our brains consume a lot of energy every day—approximately 20 watts to maintain spontaneous neuronal activity.
in reply to Wayne Radinsky

And exactly zero watts of that is electricity.

Do you understand basic biology?

in reply to Wayne Radinsky

What is your point? You eat food, metabolize it, and the brain consumes part of the energy. You don't like the use of watts as a casual conversational term? Suggest a better way then.
in reply to Wayne Radinsky

I was pointing out that you were spouting gibberish.

A supercomputer emulating a brain a hundredth the size of an ant's is a completely different sort of problem than 1,000 humans eating a sandwich.

AI is a wasteful and stupid technology, and we cannot afford to use any electricity on it, because we are already consuming more electricity than is sustainable.

Handwaving about "miniaturization" just underscores how poor your understanding of any of this is.

A brain has trillions of neurons.

The largest AI has the equivalent of maybe 10,000 neurons, and uses megawatts of power to run it all.

That has nothing to do with "miniaturization".

Actual neurons are just magnitudes more efficient than their emulations, and there is no reason to think this will ever change.

in reply to Wayne Radinsky

Also, keep in mind that current AI costs are mostly in developing the foundation models. Once that is done, the cost and energy consumption of actually using AI is very low. The next few generations of the iPhone will do amazing intelligence on the phone - on the phone - get it?
in reply to Wayne Radinsky

Intelligence can be thought of as information compression. Much of human intelligence is a result of culture having encoded a compressed scheme of reality in such a way that people can learn the shared encoding scheme. Similarly with AI. The moral of the story is that people who mock and sneer at AI for being overhyped and too resource-intensive might not understand what the bigger picture is.

True, in some cases "get a horse" might be the answer, but evolution has many tricks up its sleeve, and one should not keep all their eggs in one basket of contempt. So to speak.
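
To make the "intelligence as compression" idea concrete, here is a minimal sketch (my own illustration, using Python's standard zlib module): data with learnable structure compresses dramatically, while structureless noise barely compresses at all -- which is the sense in which a good model of the data "compresses" it:

```python
import os
import zlib

# Structured data: a repeating pattern that a model (here, the
# compressor's dictionary) can learn and exploit.
structured = b"the quick brown fox jumps over the lazy dog " * 500

# Unstructured data: random bytes of the same length, with no pattern.
noise = os.urandom(len(structured))

for label, data in [("structured", structured), ("noise", noise)]:
    compressed = zlib.compress(data, level=9)
    print(f"{label}: {len(data)} -> {len(compressed)} bytes")

# Typical result: the structured text shrinks to a small fraction of
# its original size; the random bytes stay essentially the same size.
```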

in reply to Wayne Radinsky

I'm not sure how we got on the relevance of miniaturization, but looking at the numbers with a current "system on a chip", it seems processors are beginning to scale up to brain size. That sure does put a spin on what we think of as classical programming. Just imagine you are a programmer and you have a computer at your disposal the size of current mobile phone computers. Billions of transistors at trillions of operations per second! Kinda boggles the mind:

The A17 Pro boasts 19 billion transistors and a 6-core CPU, with two high-performance cores ..., and four high-efficiency cores.

The 16-core neural engine can process up to 35 trillion operations per second, .... There are also additional dedicated engines for [graphics].

in reply to Wayne Radinsky

The 16-core neural engine can process up to 35 trillion operations per second, … There are also additional dedicated engines for [graphics].

One of the fascinating things about the brain is that it functions in a variety of modalities, some of which operate in parallel. Then there is the hybrid digital/analog nature of the brain as well. We are still only scratching the surface of how all this actually works.

in reply to Wayne Radinsky

exactly zero watts of that is electricity

Technically there is an amazing amount of actual electrical activity happening in the brain. Any movement of ions across a gradient can be termed “electrical activity”. More specifically, neuronal action potentials are formally considered “electrical activity”, at least by the definition that the NIH uses.

https://www.ncbi.nlm.nih.gov/books/NBK546639/

in reply to Wayne Radinsky

The brain does generate electrical potentials, yes. That's not the same as consuming electricity, and very much not the same as drawing power from the grid, which is what would be required for the stupid comparison the tech-lords are trying to float right now.
in reply to Wayne Radinsky

FFS, AI hype-jockeys are always trying to dishonestly exaggerate their bullshit.

Like, six months ago they were really desperately trying to compare Parameters to Neurons.

Now apparently it's comparing transistors to neurons.

Both are completely false comparisons.

The only reasonable comparison is between Layers and Neurons.

And even then, it's clear that one layer is less powerful than a neuron, although whether it's 7 times or 70 times, we don't really know.

Bottom line: As I said, the heaviest AI is not even a tenth the power of the brain of an ant.

In other news, no, LLMs do not understand the bullshit they spout.

And in yet other news, AI providers are downgrading expectations across the board, because this absolute bullshit technology is only generating profits for the hardware manufacturers, and that too will not keep up once the bubble bursts.

in reply to Wayne Radinsky

"LLMs do not understand the bullshit they spout."

LLMs have met their match here.

"the heaviest AI is not even a tenth the power of the brain of an ant."

Are you measuring ant power in watts? Lol.