Future Computers Will Be Radically Different (Analog Computing)




Dec 5, 2022






Lavee Yang 9 months ago
20 years ago, my computer science professor George Davida said analog would make a comeback for complex calculation. Looks like he was right!
rad 2 days ago
oof yourself
TeD van Loon 5 days ago
@Joseph Rogina They are essentially the same; a quantum computer is just a type of analog computer with a few small differences in use. It's a bit like an analog FPGA, but truly analog, in that all links can also be connected in an analog way. Quantum computers are often converted to a digital signal when they're read out, but looking at the physics behind them, they are analog computers. Have to go now, so I can't go much deeper.
TeD van Loon 5 days ago
@Scoobert McRuppert Might be fun, perhaps a possibility. But if I do, I've got to make sure not to accidentally describe how the things actually work. Unless I really want to, but then I'd have to be really sure I want to be a sci-fi writer, since once I tell how it works, people will think they came up with it after reading it and trying it out for real, despite it actually being real things I designed long ago and included in it. It would be fun, though: making a sci-fi series and secretly hiding in it how to actually make such things. That would be a great way to secretly pass information to people who need it.
TeD van Loon 5 days ago
@fergal farrelly This has actually already been done: someone once made a virus for a quantum computer (an analog programmable cluster PC) to mine crypto. It wasn't written too well and only ran for a short time, back when quantum computers were new and quite weak, but they made an insane amount with it.
TeD van Loon 5 days ago
@Adahmantium Little will happen. I heard a company boast about how they were basically ready to publish their "consumer quantum computer" years ago. I don't know if they did, but what they meant by "consumer" was actually multibillion-dollar corporations. That, however, isn't the main problem here, only part of it. The main problem is that quantum computers just aren't used yet. People need to develop things for them, and they work really differently from digital computers, meaning developers have to think differently, more like a sculptor than a programmer. It still needs a lot of programming, though, so there are only a few people who can somewhat program for them, and very few actually do. The technology first needs to be in the hands of hobbyists before it takes off, since in general hobbyists are the ones who make the real breakthroughs in such things; corporations tend not to take that effort and risk these days. So open-source and hobby researchers tend to develop such things before they actually get used, often under different names, so most people won't know something was actually developed by hobbyists in open source. That is actually one of the big reasons why IBM lets you use their quantum computers for free: despite the huge costs, and most people just using them for random fun or mining, there are also some hobbyists who use them, and they write the software that makes the computers usable in the future, typically releasing it as open-source software.
lordsneed 1 day ago
So wait, you can't use the analog chips to train the network? But that's what uses the power! Usually just running the trained neural network doesn't require much computation at all and is pretty fast. That's not very exciting.
Karkess 5 months ago
My undergraduate work was actually with a professor who did research on the brain as an analog computer, using neural networks and analog computing in an attempt to achieve super-Turing computation. A researcher whose name is worth looking into in all this is Hava Siegelmann. At the time I understood much less about the problem; my task was essentially to try to prove that analog computation could be modeled with a neural network on a digital computer. Not sure if my comment will be buried or not, but it's an area worth looking into if you're more deeply interested in this problem.
Patrick Fuhrmann 2 days ago
I started watching some of her talks; there's one from a 2017 convention that is very interesting.
Karkess 4 days ago
@David R. Lentz It sure does. It relates to computing processes beyond the so-called "Turing limit".
Pen Gu 5 days ago
@David R. Lentz David, the joke being that Turing machines were meant to capture all that was capable of human algorithmic thinking.
David R. Lentz 5 days ago
@Pen Gu, please outline the variables and the parameters, and explicate the particulars of the model.
Pen Gu 5 days ago
Elephant in the room: if human brains are indeed super-Turing computers, then that seems to defeat the Church-Turing thesis, no?
Branden Hickling 3 days ago
This gave me such intense chills, dude. Amazing.
Elliott Park 3 days ago
Well, technically everything, at its smallest composition, is on or off, quantumly speaking.
Ian Taylor 9 months ago
Fun fact: that graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1200. He says it draws 100 W, but it actually draws about 250 W, so that tiny chip that only draws 3 W is even more impressive.
ApplePotato 10 days ago
It's disingenuous to compare the 3 W chip to a GPU. The GPU does a lot more than just simple tensor operations; the GDDR memory system alone consumes around 20 W. Floating-gate transistors take a lot more power and time to perform a write operation, and they also wear out as you write to them. You will not be able to do any meaningful AI training on the 3 W chip. What we should compare it to are the tensor cores on newer Nvidia GPUs. The RTX 3090 has 320 tensor TFLOPs, which is 100 times the performance of the 3 W chip, at around 300 W total board power. Meaning that the digital Nvidia tensor cores are still far more efficient than that 3 W chip per TFLOP.
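Taking the figures quoted in this comment at face value, a quick back-of-the-envelope check is easy to run (all numbers are the commenter's claims, not measurements):

```python
# Back-of-the-envelope efficiency check using the figures quoted above.
# All numbers come from the comment itself, not from measurements.

gpu_tflops = 320.0                # claimed RTX 3090 tensor throughput
gpu_watts = 300.0                 # claimed total board power
chip_watts = 3.0                  # the analog chip's claimed power draw
chip_tflops = gpu_tflops / 100.0  # "100 times the performance of the 3 W chip"

gpu_eff = gpu_tflops / gpu_watts     # TFLOPs per watt
chip_eff = chip_tflops / chip_watts

print(f"GPU:  {gpu_eff:.2f} TFLOPs/W")
print(f"Chip: {chip_eff:.2f} TFLOPs/W")
```

By these particular numbers the two come out at roughly the same TFLOPs per watt, so the "far more efficient" conclusion depends heavily on which figures you trust.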
Yashaswi Kulshreshtha 25 days ago
lol cute little chip
Dajes 28 days ago
@Justus Karlsson You are either using a small model or utilising the GPU inefficiently (bottlenecking it with CPU operations). If you run a big model with an efficient data-loading pipeline, it easily uses 90+% of max power consumption.
Muhammad Ikhwan Perwira
It's not apples to apples: you are comparing a special-purpose digital processor for graphics with a special-purpose analog processor for neural networks.
John Doh 7 months ago
@Matthew They're specialized to hold a value; I don't see how that's going to help, and the underlying systems are still digital. The main advantage was to speed up the process of reaching a weighted value by sampling the power from a memory cell, which then gets converted to a digital value. Don't worry, the GPU shortage is ending, although the lockdowns due to Covid in China may create another temporary shortage. Right now, though, there is no shortage of GPUs. Pricing, on the other hand, is a different topic, and there's a lot of change in the economics affecting the price of GPUs coming from Asia to the US, starting with manufacturing costs being about 15-20% more than they were at the beginning of 2021.
Carter Bentley 6 months ago
Back in the mid-1960s my uncle, Joseph Grandine, designed a combination analog/digital computer that could optimally combine the two modes to solve complex problems in signal processing and data analysis. He called his computer the Ambilog 200. At that time, digital computing won the day. Now it sounds like he was a few generations ahead of his time.
Hussain Munshi 1 month ago
That's 60 years ahead! Yeah, maybe we'll have combinations of computers, or a medium to link analog and digital computers, so that analog-specific tasks can be done on the analog side and the results appear on the digital one.
Abhash Kumar Singh 1 month ago
@Dino Schachten Can you please paste the links here?
Les Aventures de Gorman
@PREMiERE PARER No, he didn't; his uncle did.
Robert Roudebush 3 months ago
@Deal Hunter Sad but true.
notahotshot 3 months ago
@nullbrin3 FOH
Rehoboth Farm 1 day ago
Digital or analog, a computer may be able to identify a chicken, but it still has no understanding of what a chicken is. Even if a computer can query Wikipedia and regurgitate information about chickens, it still has no idea what a chicken is. Computers have no consciousness; they have no soul. AI will never be any more or less alive than my toaster.
C Oe 3 days ago
Okay man, the video suggests that these computers are right now an alternative, but they've never been gone. The latest analog PC is the quantum computer. They never stopped using analog computers, nor did they stop evolving them. Nothing new for a straight 100 years.
Mike Yancey 5 months ago
I was a finalist in the state science fair competition back in the 4th grade, so around 1977. My project was a board about two feet by three feet, full of two- and three-position switches and colored lights. It was a logic board that could solve various types of equations. Pretty cool in a time when almost no one had ever touched an actual computer. In the end I learned absolutely nothing about computers, but I learned to solder really, really well from it. Moral of the story: if you can't learn to code, at least learn to solder. :)
no 10 hours ago
Never too late to learn :) Start with Arduinos; they're amazing little chips that can do wonders. Even something simple like a temperature sensor might be fun.
5MadMovieMakers 9 months ago
Hyped for the future of computing. Analog and digital could work together to make some cool stuff.
Michael Isbill 2 months ago
@B-612 🤣🤣🤣 Spirituality is the definition of quantum physics. Quantum physics and spirituality have the same definition: invisible forces that influence the physical world. We aren't deviating from God by advancing technology. Technology is advancing so that we can perceive God on our own terms rather than be told whether we can or not.
B-612 2 months ago
God is analog, Satan is digital
Nathan Lloyd 2 months ago
@Teru Because they make some pretty neat beep boops...
Laxman P.K 2 months ago
They are already working together.
Michael Isbill 2 months ago
Our conpu
Benjamin van Dijk 3 months ago
For performing heavy mathematical operations on weak microprocessors, analog computation comes in very handy as well! A great trick for doing integration (and differentiation) on an Arduino is to use an op-amp with a capacitor and a couple of resistors to build an integrating (or differentiating) amplifier. With the ADCs and DACs readily available on that chip (same for PIC or other low-clock-rate microcontrollers), it takes very little in the way of resources to get it going :-)
man of planet Earth 3 days ago
give him more info, bro! 🤠
AhmedMohamed Hassan 2 months ago
I think your idea is very good. Can you give me more info on it?
Ian Jackson 3 months ago
I'd heard about analog synthesizers delivering a distinctly truer tone than digital synths, but I'd never heard someone explain why that's the case... It took me two minutes of this video to connect the dots and answer that question myself, based on Dr. Muller's introduction to the subject. Sad to think that improving educational formats and giving educators access to modern teaching resources is a political opinion and not an expectation in the West.
Chris Idema 3 months ago
A synthesizer is basically analog (and nonlinear). A digital synthesizer will emulate some of the analog processes. In theory there is no reason why the digital one would perform worse; in fact, if the quantization noise is lower than the analog noise, it might produce a purer sound. But in practice digital synths can be limited in resolution (the number of steps on the knobs) and CPU power.
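The quantization-noise point above can be made concrete with the standard rule of thumb for an ideal N-bit converter driven by a full-scale sine wave, SNR ≈ 6.02·N + 1.76 dB:

```python
# Ideal signal-to-noise ratio of an N-bit quantizer for a
# full-scale sine wave: SNR ≈ 6.02*N + 1.76 dB (standard rule of thumb).

def quantization_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (8, 16, 24):
    print(f"{bits}-bit: {quantization_snr_db(bits):.2f} dB")
```

At 16 bits the quantization floor (~98 dB down) is typically already below the analog noise floor of real synth circuitry, which is exactly the comment's "purer sound" scenario.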
Channel For adventures
I think that's the distinction that often gets muddled in translation. Digital is the optimal way to PROCESS information at a general level; analog is the optimal way to COMPUTE information at a specialization level. Our brains are digital systems in that they process 1s and 0s through the activation of neurons and synapses at insane scale and simultaneity. But each neuron, despite being connected into the vast network that makes up the brain, IS an analog system. Over billions of years, evolution has figured out the optimal way for both to coexist and be co-applied to handle a variety of tasks, with bandwidth and latency that has not yet been matched by digital or analog systems, independently or together. We are getting there, though, bit by bit.

We also have to recognize that the brain is not the only computational system in the body. Our nervous system extends to all our extremities and into all our organs, and trauma of any kind (minor or major) is felt instantly from the point of contact back to centers in the brain where it is processed. So you have a staggering amount of analog activation behavior running up and down all those channels, processed both at the point of origin and into memory. When you put your hand or finger near a hot flame, you feel the temperature differential, and if you keep it there long enough, it hurts; a lot. The nerves in your finger, and thereby neurons, are being activated there, and you feel the pain. This information, called pain, is relayed up the nervous system to the brain, where it is stored in memory. Factor in our senses and our ability to learn (essentially neural-network training), and that information is processed and stored in short- and long-term memory. Thus, the next time we see a heat source, we know we shouldn't put a hand or finger near it for too long, and not on it at all.

Of course, there are cases where, despite the risks, we do it anyway, because we have moral or ethical imperatives driving us: we want to save or protect someone or something, so we disregard the risk and pain and injure ourselves to achieve that goal. But that's a different line of thinking, not specific to this. Ultimately, there's some commonality in this example with octopuses, in that they too have a distributed nervous system, and their ability to "think" exists beyond the central brain. To that end, I think it's fair to say that we have the same ability and have used it similarly over time. We use our hands and feet to advance civilization; our dexterity is essentially a form of distributed computing, with each finger and muscle actuation manipulating or propagating something we want or want to do. All of this is a composition of analog systems that ultimately centralize into a digital processor whose primary purpose is to store useful information in memory and discard the rest. I should end here or else I'll digress too far. Thanks for making this video.
HarryD 24 days ago
@Yulfine We're talking about specialising for the task of matrix multiplication, which is part of the process that allows AI to learn to complete general tasks. The tasks are general; the computation is specialised.
Wristocrat 1 month ago
Paradigmatically, computation is the processing of digits.
Yuri Vanhaeren 1 month ago
The way the video presented it, my mind immediately saw that digital is great for storage (long-term, perfect), while analog is great for fast temporary results (AI interpretation of binary data). I fully expect graphics cards to soon jump to analog processing too, since, well, in a 3D render, does it really matter if you get pixel values that are slightly inaccurate? JPEG compression says no. On the other hand, crypto mining doesn't seem like something that can tolerate inaccuracy, although maybe an analog computation can hint at values that "come close", at which point you have a limited search domain for the binary calculations. In other words, the analog computation functions as a heuristic: it might not be right, but it's fast and cheap.
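The "analog as a heuristic" idea in this comment can be sketched abstractly: a fast but noisy estimate narrows the window that an exact digital pass then verifies. In this toy sketch the "analog" stage is only simulated with bounded random noise; nothing here models real hardware:

```python
import random

# Toy sketch: a fast but noisy "analog" estimate narrows the search
# window, then an exact "digital" pass verifies candidates within it.
# The noisy oracle is simulated; the numbers are invented.

random.seed(0)
target = 7_345_678

def noisy_estimate(x: int) -> int:
    # pretend analog stage: the right answer plus bounded noise
    return x + random.randint(-1000, 1000)

guess = noisy_estimate(target)
# exact digital verification over the narrowed window only
window = range(guess - 1000, guess + 1001)
found = next(x for x in window if x == target)
print(f"found {found} after checking {len(window)} candidates")
```

Because the noise is bounded by ±1000, the target is guaranteed to lie inside the 2001-value window, so the exact pass only ever checks a tiny fraction of the full range.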
Yuri Vanhaeren 1 month ago
I'm thinking you should elaborate on what you consider the difference between PROCESS and COMPUTE. Given that we have chips we call computer processors, I'm expecting a lot of misunderstandings.
creativebits 3 months ago
Just adding that evolution didn't figure out anything. Evolution does not think. A designer does, like... God.
Domenick Giambattista
It's funny: for those of us who are into electronic music production, analog never left! There are lots of great analog synthesizers out there that can produce all kinds of complex waveforms, and some of us have been known to tack an oscilloscope onto modular gear to view and use those waveforms. Even some relatively simple gear can produce complex, "3D" structures with the correct cable patches. A lot of what you described at the beginning is the backbone of synthesis for music, and the same principles obviously apply to mathematical operations.
Anish Saxena 8 months ago
As a young PhD student in computer science, I found your explanation of how neural networks came to be and evolved, and the math behind them, the cleanest and most accessible I have come across. As I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!
bartek burmistrz 5 months ago
Have you checked out the 3Blue1Brown videos on neural networks?
Vincent Zanada 5 months ago
@Kelly Smith For the AI training algorithm.
Kelly Smith 5 months ago
@Vincent Zanada ???
Vincent Zanada 5 months ago
It's actually the first time I've understood the mathematical meaning behind what such videos usually refer to as treats and punishments...
falconhawker 5 months ago
How can "computer science" exist? A computer is like a mathematical wrench or pliers; it is merely a tool. Computers can APPROXIMATE to a very high degree of accuracy, but a computer cannot prove anything about any scientific theory.
Dino Schachten 6 months ago
I'm always slightly confused by the miniaturisation cited as a problem: CPUs and GPUs occupy an absurdly small percentage of the volume available in most computing devices. As long as heat is managed properly, you could theoretically fill the ~70% empty volume in a tower PC with GPUs (and I'm not talking about the whole module, which has to be big for logistical and ergonomic reasons). If you could manage heat and lanes efficiently, if it were just about that 5x5 cm chip, there's a lot of room to get 250+ times more performance out of a device the same size.
Zyborgg 6 months ago
I wonder how much heat is generated relative to size in digital vs. analog chips like that, and how much cooling they'd need to work. Since analog doesn't use nearly as many watts, I guess you don't really need that much cooling.
Matthew Funaro 5 months ago
Could an analog computer create another level of security, being harder to 'hack' given that it's a task-specific rather than a multitasking machine?
Surya. 8 days ago
Nope. It's just a different form of encryption.
Larion233 3 months ago
I'm most hyped for optical computers. I guess they're technically still digital, but almost zero heat and light-speed interconnects throughout the whole system could usher in way higher GHz, or even THz, processing speeds.
PaintDry 5 days ago
I feel like any computer engineer you talk to would tell you it would be much simpler if we could encode everything in analog. The problem is that the math involved is much more complex than that in digital systems, and analog machines have been too expensive to maintain due to their reliance on precise electrical currents and a general need for environmental stability. Also, your/our model of cognition is wrong: similarly to digital systems, it's easier to think of neurons as firing or not firing, but really they fire at different levels of intensity depending on a variety of factors.
funktorial 9 months ago
Started watching this channel when I started high school, and now that I'm about to get a PhD in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. Not dumbed down, just clear and accessible. Great stuff! (And this totally nerd-sniped me, because I've been browsing a few papers on the theory of analog computation.)
Steve Acomb 8 months ago
"Nerd-sniped" lmao. I feel exactly the same, and here I was thinking I was way ahead of the curve on alternate computing 😂 joke's on me
mentaltfladdrig 9 months ago
Same here. But I didn't go to high school, my life became a total mess, and I haven't graduated whatsoever :)
Victory Morningstar 9 months ago
I'm smart too.
Gautam Bidari 9 months ago
Absolutely. Love the way he covers the concept for everyone. Those who don't know the topic in depth can still walk away with a basic understanding, and those who do understand it in depth will enjoy discovering new areas of invention to explore further. Looking forward to reading some papers on using analog computing in neural-network applications.
Chris Idema 3 months ago
Brilliant to use flash-memory technology for analog computing! When I think of analog computing, I think of using operational amplifiers, resistors, and capacitors to do the calculations. These are very limited in bandwidth, so I thought optical processing would take off; Lightmatter has produced a light processor that can do large matrix multiplications and additions. I didn't expect flash-memory technology to be used. Very creative. I think we are going to see a mix of different technologies in the future: perhaps optical processors for training neural networks, analog processors for executing them, and digital for the rest. I'm sure digital will never be replaced, since there are many problems where we need an exact solution and where a program needs to run for a long time without errors. Electronics will also never disappear, since electricity is so fundamental: energy production and storage are electrical, sensors and actuators are often electrical, and we have powerful digital and analog electrical processors.
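The flash-cell trick this comment admires boils down to Ohm's law and Kirchhoff's current law: the stored conductances form the weight matrix, the input voltages form the vector, and the currents summed on each row are the matrix-vector product. A minimal numeric sketch (the conductance and voltage values are invented for illustration):

```python
import numpy as np

# Analog crossbar matrix-vector multiply, in the abstract:
# each cell stores a conductance G[i][j] (a weight), a voltage V[j]
# drives each column, and by Ohm's law plus Kirchhoff's current law
# the current summed on row i is I[i] = sum_j G[i][j] * V[j].
# Conductances and voltages below are invented for illustration.

G = np.array([[1.0e-6, 2.0e-6, 0.5e-6],   # conductances in siemens
              [0.0,    1.5e-6, 1.0e-6]])
V = np.array([0.2, 0.1, 0.4])             # input voltages in volts

I = G @ V   # output currents in amperes: the "computation" is the physics
print(I)
```

The digital-side equivalent is the same `G @ V`; the analog chip's appeal is that the multiply-accumulate happens in one physical step per row instead of thousands of clocked operations.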
Never Comments 2 months ago
First comment ever on the channel, although I've been a big fan for a long time! This episode is very close to my heart: as a computer scientist myself, the topic touches me greatly. I've always believed that digital computing might soon be too slow to sustain scientific growth, as chips approach the near-atomic level and large calculations bring a lot of entropy with their binary conversions. I believe that to go subatomic we will need a combination of digital and analog computing, and then the true power of quantum computing will be unleashed. This might have a very interesting effect on co-NP problems such as factorization (e.g., RSA cryptography). I would love a video on this topic: what happens if quantum computers become able to break cryptographic algorithms in polynomial time? What would the consequences be for digital security?
Chris Long 2 months ago
When I was in the Army in the '70s, I was in a counter-battery radar unit (MOS 17B). Our radar was an AN/MPQ-4, and it was governed by an analog computer to direct artillery fire. It was very good at its job, but if you looked at it wrong, it would go down. Almost all of its downtime, though, was caused by problems with the radar transceiver parts rather than the computing part, which was completely gear-driven and very robust, like an old NCR cash register (amazing machines, if you've never seen the innards of one!). Incidentally, one of my Chief Warrant Officers (a CW4) was involved in its design, and boy, was he smart! He'd been in the Army since the Korean War and was an old fart when I knew him.
Catree 1 day ago
Kids in 3000 realizing their brains are running digital and not analog: *visible frustration*
Jeff C 9 months ago
One of the first things I learned in electrical engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there who remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.
John Chestnut 8 months ago
@Chop Holtz Trolls are not appreciated. Technology and invention advance because people remember and think. But go ahead and act like a cookie cutter, unable to do anything more than you are told to do.
Chop Holtz 8 months ago
Why are you glad about this? Does it make you feel cozy to hear something that reminds you of when you got your first electrical engineering education?
John Chestnut 8 months ago
@John Malcolm Tubes are the answer. Overdrive the circuit and include clipping.
John Chestnut 8 months ago
@John Malcolm Most people don't know what a transistor is. "It's in computers... I think?"
Isaac H. 8 months ago
@Jason Barron Not of accuracy, of precision. Analog can be accurate but not consistently precise. It also depends on the use case and scale.
Dino Schachten 6 months ago
I love how your videos explain so much more than what's indicated in the title; they are always more interesting than you'd expect. And yes, absolutely, those analog chips will become a supplement: perhaps the next-generation GPU, while CPUs remain more or less digital, or perhaps most processing units of the future will have an analog side chain.
mikeall 1 month ago
As part of my job, I work on a lot of control systems. Analog controls have far superior accuracy compared to digital; the sampling-rate limitations of digital can be a problem. But analog systems are finicky and drift, requiring a lot of maintenance, and digital systems are easier to make redundant. Still, digital can never hold a candle to the infinite sampling of analog.
Shyam Trevino 4 months ago
Analog computers --> punch cards --> room-sized computers (ENIAC) --> microchip computers --> optical/quantum computers? Perhaps it will loop back again! I believe that analog computers are indeed the future of AI. Optical computers are naturally analog too, and they combine the speed of the photon with analog computing. My colleagues mentioned that Procyon Photonics is a new startup that recently made a breakthrough in that sort of exciting technology. I hope they, or someone else, figure out how to integrate neuromorphic computing into all of this. That's what I'm really excited about.
mage 3 months ago
Well, not quite: quantum computing can mimic analog computing quite easily. But for the price, I could definitely see analog computing being used in place of quantum computing until we get a really good quantum computer.
Price 3 months ago
So in the far future, after analog, we're going back to punch cards, room-sized computers, then back to microchips? It's evolving... not backwards, but in circles.
Carbon Roller Caco 6 months ago
Another specialized use for analog hardware would be one that specifically exploits its imprecision: reliable randomness in computer simulations and games. I wouldn't put it in all games, of course, since there's still value in patterns in that area: total randomness could unfairly corner a player with terrible luck, and part of the fun of certain personal game challenges like speedruns is finding how to consistently foil the RNG. But for something simpler, like digital deck shuffling, wind velocities in golf and flight sims, and road conditions in driving sims, especially in multiplayer, so no one can use knowledge of the RNG for a technically-fair-but-against-the-spirit-of-the-gameplay edge, the idea can work. And, of course, Monte Carlo simulations.
Brice Roberts 4 months ago
As I sit here working on my NLP research, I happened to get this video suggested to me. Thank you for covering this field and doing such a great job explaining the history of neural networks. I love your videos, and thank you for the breadth of knowledge and experience you impart to your viewers. Dr. Muller, you are an amazing champion and advocate for the STEM field, and I think what you do with your channel is exemplary. Thank you!
CloudyDaze 6 days ago
This is the first discussion of artificial intelligence I've heard that didn't make me roll my eyes, and it actually had me not only entertained but intrigued. Well done. C:
Felix 10 days ago
This video is awesome, and I learned a lot from watching it, but one thing really confused me at the end. Towards the end, there was an implication that an analog neural network would have to be physically present in any device in which it is used. But neural networks aren't computationally expensive to use (to compute an output given an input); they're only computationally expensive to train. Is it possible to train a neural network on analog hardware and then export the network weights onto a digital system? It seems like that would be the preference. It doesn't even have to be an easy process: Google could use analog microchips to train parts of their search algorithm and retrain as often as they're able, and as long as they can export the weights back to digital, no analog needs to be involved when I Google "what the heck is a neural network". I can see how one part of the problem could be extracting the values stored in each weight (measuring the conductivity of particular repurposed storage gates), but in that case, you could focus on making the chip more modular for any manual steps needed to read the values, instead of focusing on making it more compact and perfectly efficient. Even if you have to train the model, test it, and then tear it apart to get the weights, it would probably still be worth it for the portability of the end result.
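The export idea in this comment is plausible in principle, because once the weights exist as numbers, inference is just a few matrix multiplies. A minimal sketch, where the "weights read back from the analog array" are stand-in random numbers:

```python
import numpy as np

# Sketch of the comment's suggestion: read trained weights out of the
# analog array (stand-in random numbers here), then run inference
# digitally. A two-layer network's forward pass is just matmuls plus
# a nonlinearity; no analog hardware is needed at inference time.

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # "weights read back from analog"
W2 = rng.standard_normal((2, 4))

def forward(x: np.ndarray) -> np.ndarray:
    h = np.maximum(W1 @ x, 0.0)    # ReLU hidden layer
    return W2 @ h                  # linear output layer

x = np.array([0.5, -1.0, 2.0])
print(forward(x))                  # two output values, computed digitally
```

The hard part the comment identifies is real: reading the weights back means measuring each cell's conductance accurately enough that the exported numbers match what the analog array actually learned.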
james kirk 6 months ago
I have been thinking about this for a while, and although size and power still seem to be bottlenecks, I see future AI being built from tunable lasers, fiber optics, and optical detectors, very much mimicking the brain itself, with wavelength playing the role of neurotransmitters, along with signal strength, duration, and so on. I can only imagine a neural network being an analog computer of the future.
Follow Media 9 months ago
As a NAND flash engineer, that bit about using floating-gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb", where even low voltages applied to the gate (like during a flash read) to query its state can eventually cause the state itself to change. You would think it's a binary effect, where if the voltage is low enough it would just be a harmless read, but no: eventually there will be electron build-up in the gate (reading it many times at low voltage has a similar effect to programming it once at a high voltage). In this particular application, the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains, where the more we recall a particular memory, the more inaccurate it can potentially become.
le berger des photons 9 months ago
@Thurman Zhou I'm sure you've worked on some interesting things, but I believe there are projects googolplexes in advance of what we get to see. "They" have been running the patent offices around the world for a long time. Albert Einstein started there; he was an actor and not a scientist. They brought down the WTC by turning it into powder in less time than it takes for an object to fall from the height of the roof. It seems they can transform matter back into the light it came from and let it turn back into mass while it loses its former coherence. They could clearly have light computers, light storage, and light retrieval without us knowing.
Thurman Zhou 9 months ago
@le berger des photons Timing and light. If they were able to make a computer that operates on light, good for them. You know fiber optics? What it does is pretty amazing, besides the fact that the laser switches on and off to get a readable signal of 0's and 1's. Timing. You have to have super good clocks at both ends that are synced; nothing in the digital world would work without it. Every time you talk on the phone, text, use your computer, use streaming services, or write this very blurb, it's all timing. And this is just straight linear; we are not talking about splitting the light or holding it somewhere to retrieve it later... somebody has to make this stuff work. I know a LOT about light, more than most, but not as much as some. With the resources and time that you throw at a project, do you do the thing that's insanely hard (of course you still have people look into it), or do you put the time and effort into something that is hard, but doable? Here's the thing: AT&T in the 1960s made thousands of miles of fiber optic cable. Once they deployed it, it didn't work. Sure, you could get a signal for about 15 meters, and it was useful in those cases; for communications over long distance, it was not. The fiber was dirty even though it 'looked clear'. It took some people till 1985 to find out what was wrong and how to correct it. Concurrent with that was the increase in computers and other technologies. Since then, Al Gore invented the internet. (sic) In my opinion, a light computer was built but had problems. Wouldn't matter anyway; it couldn't have exceeded what we have now... 2^n. I don't know what the benefit might be. I can speculate, Speculate, on some pretty far out stuff, and don't need drugs to do it. What do you think 2^googol looks like? (Not the company Google)... or something really out there, a googolplex. A googol is a 1 with 100 zeros; a googolplex is a 1 with a googol of zeros.
For example, if you have 1 million, you have 6 zeros, so a "million-plex" would be a 1 with a million zeros.
Pradeep 9 months ago
@JeyeNooks indeed it is!
Thurman Zhou 9 months ago
@le berger des photons Of course anything can be faked. However, it seemed reasonable that light just wasn't a good way of computing. Believe it or not, among things that aren't linear, work is still being done on cold fusion. Up to the quantum computer, it's still mechanical, analog or digital. With analog, you are just hard-wiring the process. The first census card machines were done that way.
le berger des photons 9 months ago
@Thurman Zhou They can publicize there being a problem with it just like they can hide the solution. Just like they could have faked the problem; since only they had it and knew how it worked, they could tell us anything. I see you didn't localize it in time. I bet they have it all worked out, that the place in the southwest USA where they guided a whole river through a hole they made in the rocks, allegedly to cool a bunch of silicon, that if you measured the temp at input and output you'd see that they were the same right before they put you in Guantanamo for life. That it's yet another place where they can steal tons of money from the gullible public. The real computing power keeping tabs on every human on Earth is maybe in a cubic meter of space that actually gets cold rather than warm in operation and is placed under the guardians' butts, and they don't know it's there. We saw at 9/11 that they've got free energy down pat; it's the only way to make 1 million tons of steel and concrete turn to powder in ten seconds. If you know as much as you say, you're likely part of the system; sure, it's likely that you're fooled too about what's going on, that you believe what you say.
Alexander Banman 2 months ago
Awesome series, so cool to see the potential for analogue and the breakdown of the technology. My theory is that when robots finally achieve sentience, they will be much like us.
James Trades 10 days ago
It blows my mind how far humans have come. This is such an interesting and scary time to live in, when I am 60 years old (30 years from now) the world as we see it today will be completely different.
Phil Truthborne 6 months ago
These visuals combined with a clear explanation are so well made that I'm barely having any trouble at all grasping this, and I've never studied this area before. Absolutely amazing presentation and teaching!
SPXRKZ 6 months ago
Very informative. Interesting how it could potentially lead to machines taking over, or being less controllable than humans desire...
Mitoni 9 months ago
This was one of the best layman's explanations of neural net training models that I have ever seen. Awesome content!
Paul V 9 months ago
I am a layman; I did not understand a bit of it, pun intended.
patakk 9 months ago
But it isn't; he literally said he's going to skip backpropagation (which is how models are trained nowadays).
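For anyone curious about the skipped part, here is a minimal, hand-rolled sketch of gradient-descent training on a single sigmoid neuron; a toy stand-in for full backpropagation through many layers, with made-up data (the OR function) and learning rate.

```python
import math

# Train one sigmoid neuron with gradient descent so that it learns
# y = OR(a, b). For cross-entropy loss, the gradient of the loss with
# respect to the pre-activation is simply (output - target).

def sigmoid(s):
    return 1 / (1 + math.exp(-s))

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b, lr = [0.0, 0.0], 0.0, 1.0

for _ in range(1000):                  # many passes over the tiny dataset
    for x, y in data:
        out = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        grad = out - y                 # dLoss/dPreactivation
        w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
        b -= lr * grad

preds = [round(sigmoid(w[0] * a + w[1] * c + b)) for (a, c), _ in data]
print(preds)  # learned OR: [0, 1, 1, 1]
```

In a real network the same (output - target) signal is propagated backwards through every layer via the chain rule; that is the part the video skips.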
duong chuc 9 months ago
kxllo 5 months ago
My vision is that the future of humanity will have a mix of the two. Analog is best for specific tasks; take digital and mix it with analog to speed up the process of learning a specific task. It's going to be a lot faster if you just combine the two technologies: you build the analog version of whatever you want the machine to do, and you speed it up and filter it using digital. I think a mix of the two is going to be a really huge change, because you get machines that are best trained for a specific task, which is way more efficient. If you have a machine that can do everything but restrain it to something specific, all the power it has goes to waste; instead you have a designated machine best fit for the task you want, and the error margins are going to lower over time.
Chrisi 3 months ago
Derek seems to have been impressed by analog computing since the first time he heard of it 👍 And now he's presenting it in a perfectly interesting and intuitive way 😁
lucabaar1 2 months ago
Black & white (binary) vs. indefinite relative relations (analogue). The idea is how to discern and correlate in a way that holds computational integrity in an extensive system. The most fascinating thing about analogue computing is its fluid ambiguity compared to traditional binary computing: quantities of series of 1's and 0's vs. breadths of expressive potential via relativistic determination of the relevant inputs. Such a superior application of variable expression for representing complex phenomena, as opposed to binary computation.
Dingoniner 4 months ago
Yes! I KNEW analogue computers were going to be the key to AI. Ever since the 90's, I've believed that digital computers have an upper limit, and aren't really suited for simulation and AI.
Trevor Moore 9 months ago
For amusement only: my first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic-valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculation of miss distances for a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing...
NeglectfulSausage 9 months ago
@Deep Sleep Child labor laws are a major problem that has destroyed their capacity for rational thinking. If they worked at 12 years old and discovered what self-sufficiency meant, and what it cost to actually have things and trade dollars for things, they'd be viscerally defensive of people's property. When you raise a generation that is given everything, they turn to communism, which is a desire to remove property rights from individuals because "well, I need it more than you, so I take it".
Deep Sleep 9 months ago
@NeglectfulSausage: I was thinking more along the lines of child labor laws. I would agree that some 16-year-olds have the skill to do such things, but there is accountability. Should a mission-critical process fail, only for it to come out that a 16 y.o. originally worked on it, that's probably not good press for the company.
Thorny 9 months ago
@Stefan Griffin LMAO, resistance is an innate physical property of a substance. It cannot be made or stored digitally. It's like saying "digital temperature" or "digital friction" XD
Alfred Clausen 9 months ago
@Stefan Griffin Not bad, because you can keep the whole assembly at a fixed temperature a little higher than room temperature and thereby prevent temperature drift. Read about crystal ovens!
NeglectfulSausage 9 months ago
@Deep Sleep For what it's worth, there is no field of technician work which requires more than a slightly above-average IQ. Thomas Jefferson ran his mother's store at 13 years old, moved to the American mainland at 16 and went into college, and was leading troops at 19. Alexander the Great was also doing his business by 19. You're gauging older humans' capacities by the standards of today, which are "let people be adult children".
Ashley Broening 4 months ago
This is so interesting. I watched your "From Transistors to Quantum Computers" videos a few months ago and thought it was so interesting to learn what Moore's law was and why it was relevant today. Then this morning, I got a notification from SciTech Daily (which I usually get articles about technology, astronomy, and other related topics) about the next upgrade in twisted multilayered graphene structures, which are being researched for their application as superconductors, which I hadn't heard of before. It immediately reminded me of the Veritasium videos about quantum computing, and of Moore's law in general, since it was research into a molecule-sized conveyor of signals, like what the professor from those videos was researching. And then I found this video about analog computers and I feel like it's all connected in some way, possibly...I feel like surely there must be someone who can test the use of twisted multilayer graphene in the advancement of computing and data storage/retrieval. Maybe it can be used to overcome the Von Neumann Bottleneck by negating the issue of long data retrieval times...I bet that alone would reduce the energy consumption of training the neural networks simply by making the process finish more quickly. I'm sure there's a subreddit somewhere for quantum computing and stuff, but I'm not sure if I could find someone who would either already know what I'm talking about, or would be willing to spend the time referencing the articles and videos I've seen in order to understand what it is that I'm curious about.... It may be something no one's started doing, or it may be that it's already been done and just hasn't been mentioned anywhere that I could find by googling, lol.
Zagstrug 6 months ago
I really love the way this video explains every step that was needed to get us here. Each mentioned topic makes sense in the context of understanding the big picture, and the transitions between them were done so well!
cocoritosss months ago
It looks efficient, although a self-taught machine may need a way to tweak the conductance values on the fly, which is an interesting question mark... Anyway, love the idea of analog flash memories used as NPUs. It looks like an FPGA.
xEricC1001x 6 months ago
I love your conclusion that to get AI to mimic the brain, we'll need to incorporate its analog abilities as well. That is such a cool thought.
Septimius 9 months ago
I see Derek is getting into modular synthesizers! Also, funny to see how the swing in musical instruments to go from analog to digital and back is being mirrored in computing generally.
lolilol lolilol 8 months ago
@Modulated Horizons I concur. DUNE 3 is definitely one of the best sounding digital synths out there, pretty much on par with the best sounding analog synths imho.
lolilol lolilol 8 months ago
@N. Z. Saltz 1990s: ok computer 2020s: ok boomer
lolilol lolilol 8 months ago
Haha, I thought exactly the same: that analog computer was exactly a modular synthesizer.
Brayden Groshong 8 months ago
Yogev Montekyo 8 months ago
Modular, additive and subtractive synthesis ;)
Oon Chusattayanond 4 months ago
This is seen in modern-day commercial ICs as well: the analog and digital domains are each utilised to their own strengths. For an obvious example, digital is very good at adaptability and repeatability, which can be used to assist in trimming and/or calibrating the analog circuitry. Do keep in mind that ADC and DAC performance (depending on who you ask, but...) is largely dependent on the analog components.
Neil Barnett 3 months ago
The most important lesson I learnt on analogue computers was that you have to keep multiplying or dividing by 10 and remembering how many times you did it. Only one step above the slide rule. And the operational amplifiers in the modules were noisy and made your results unreliable. I suppose they do the multiplier tracking themselves nowadays, but in the 1970s it was the biggest obstacle. I note that you have a digital storage oscilloscope to display the analogue results.
Thomas Maiden 4 months ago
"Zoe the Robot" always enjoys a good VERITASIUM video. Now I am planning to see if adding motor inhibit relays may be a good idea, to prevent the robot from jumping when power is first applied to her drivers. While Zoe's A.I. system is quite complex, she could benefit from additional programming.
tramsgar 4 months ago
Thanks for a great video! Unsurprisingly, with analog NNs the first perceived application is more efficient integrity intrusion (and implicitly military and ethnic-separation applications, of course, but let's call it 'separating good food from bad food').
Dust 9 months ago
The problem with this system of computing is that interference is a huge factor. When you only test whether there is voltage or not, you don't need to worry about interference. But when you build systems that use varying voltages, say 0 V, 0.5 V, 1 V, then you do need to worry about interference, and the more levels you add, the bigger an issue this becomes. Interference can come in the form of microwaves, radiation, mechanical vibration (think fans cooling off a server rack), and the list drags on, as almost anything can cause interference. That oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but the cost is far higher than with a binary number system.
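The tradeoff the comment describes can be sketched numerically: pack more levels into the same voltage range and the same interference causes far more decode errors. The noise level and trial count below are arbitrary, chosen only to make the contrast visible.

```python
import random

random.seed(0)

def error_rate(n_levels, noise_sd, trials=20_000):
    """Send random symbols as evenly spaced voltages in [0, 1],
    add Gaussian interference, decode to the nearest level."""
    step = 1.0 / (n_levels - 1)
    errors = 0
    for _ in range(trials):
        symbol = random.randrange(n_levels)
        voltage = symbol * step + random.gauss(0, noise_sd)
        decoded = min(n_levels - 1, max(0, round(voltage / step)))
        errors += decoded != symbol
    return errors / trials

binary_err = error_rate(2, 0.1)   # 0 V vs 1 V: levels far apart
multi_err = error_rate(16, 0.1)   # 16 levels squeezed into the same range
print(binary_err, multi_err)      # binary stays near 0; multilevel fails often
```

With levels spaced 1 V apart, 0.1 V of noise is harmless; squeeze 16 levels into the same volt and the same noise swamps the spacing between them.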
Eugene 8 months ago
The world needs AI (artificial intelligence) so people don't have to work at all.
JwebGuru 9 months ago
@Phat Oof That's sort of true, but you can also build self-stabilizing analog systems, whereas digital ones tend to fail completely catastrophically on a single bit flip. So there's a rather interesting tradeoff there. For that reason, I'm pretty sure the *most* robust hypothetical systems would be analog, not digital. Of course, this is all hypothetical, since the engineering burden of trying to do general-purpose computing like that would be insane.
TAP7a 9 months ago
@Just A User There's a series of very funny images you can find where you take a trained NN, show it something like a banana, which it will identify correctly at like 85% confidence, then add a minuscule amount of specially calibrated noise, and the NN will spit out something completely unrelated, like "moose", at 99.9% confidence. It's 100% exploitable. It's easier if you have access to the model itself, but you can absolutely just train another adversarial model to identify noise additions that make the original model's predictions as wrong as possible.
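A stripped-down illustration of the idea, using a linear "model" with made-up weights instead of a real neural network: because the gradient of the score with respect to the input is known, a tiny targeted perturbation flips the prediction while barely changing the input.

```python
import math

# For a linear score, the gradient w.r.t. the input is just the weight
# vector w, so the worst-case small perturbation can be crafted by hand.
# Real attacks (e.g. FGSM) do the same through a deep network's gradient.

w = [0.8, -0.4, 0.6, -0.2]   # made-up model weights
x = [0.3, 0.9, 0.1, 0.7]     # an input the model scores as negative

def score(v):
    return sum(wi * vi for wi, vi in zip(w, v))

def confidence(s):           # squash the score into a 0..1 "confidence"
    return 1 / (1 + math.exp(-s))

eps = 0.15                   # small per-feature perturbation budget
x_adv = [xi + eps * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

print(confidence(score(x)))      # below 0.5: classified "no"
print(confidence(score(x_adv)))  # above 0.5: flipped by a small nudge
```

Each feature moved by at most 0.15, yet the decision flipped; deep networks are vulnerable to the same trick at perturbation sizes far below human perception.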
Poornakumar Das 9 months ago
@Ekaros Fundamentally, error arises in digital processing because of the "discreteness" of the data due to sampling. So no digital data is without error (unless we have infinite bits); it is called quantization (truncation) error. Analog systems don't have it. They may have other kinds of errors, but none built into the system.
Ekaros 9 months ago
You can correct for digital errors, or at least minimize them, but you really can't say whether analog data is erroneous or where the errors come from. And they might even carry forward through calculations: one mistake early on will cascade, still be present later, and may even have a greater effect.
webocoli 4 months ago
40 years ago I was working on state-of-the-art analogue gunfire-control computers. They took ages to set up, but they were certainly fast at resolving multiple input parameters. Their main components for resolving complex outputs were devices called resolvers, which finally drove power amps to drive the guns etc. Fascinating stuff!! 😁
rgw 2 months ago
Just to kill people.
y0y4y0 3 months ago
Analog is so much more powerful that it's sometimes surprising we switched to digital in some applications, for example power steering in a car: steer a minuscule degree on a modern power-steered car and it will completely ignore you, so you basically have to turn the wheel in small corrective movements the entire time (like you did on your PS2, making pulses on the stick instead of just moving it a little bit).
y0y4y0 2 months ago
@D Tibor Just pay attention next time. TBH it's more noticeable on the first versions, but you can definitely feel it. As for my car, it is mechanical, so analog.
D Tibor 2 months ago
Or maybe your steering system is broken :)) I've never experienced such behaviour.
Jay Renault 4 months ago
I've been saying this for a long time: we need to hybridize our technologies when possible, if only to see how it turns out. I've always found it weird that when new techniques and tech pop up, we drop the old ones as if they were never that useful, when really they could do so much more with modern applications.
Andrew Bauer 6 months ago
An interesting video would be to compare the digital/analogue computation in this video with the two modes of thinking from your "The Science of Thinking" video. I can see a correlation between the fast, sometimes inaccurate mode of thinking and analogue computation, and the slow, more accurate mode of thinking and digital computation. The correlation is not a direct overlay, but it has enough similarities to be analogous.
Persona 9 months ago
My professor always said that the future of computing lies in accelerators -- that is, more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.
PleaseDontWatchThese 9 months ago
You could call an analog computer an accelerator.
Lundo 9 months ago
@Alex T They don't require unique HDL code. The dominant languages are, as you state, Verilog, as well as VHDL; I'm more familiar with VHDL. What is unique to each chip model is the constraint file used to map pins to the ports you've defined. An FPGA can be programmed to whatever hardware design you wish (as long as it's within the physical limits of the chip). This means they can emulate just about every digital logic chip on the planet, including ARM and x86 chips and even GPUs. FPGAs are development boards used for designing these chips and were never meant for a consumer to use. This is probably why they're used in the retro space: computers from the "retro" era were fairly simple, such that you can actually implement their designs with discrete logic gates. FPGAs just make that easier, since you only have to write HDL code to emulate the system.
Alex T 9 months ago
@Lundo They're actually being used as products in the retro space. AFAIK, each FPGA model requires unique HDL (Verilog) code. I'm a compsci major, and the stuff you engineers do seems like magic to me.
Calculator 9 months ago
@Lundo I never said or implied you were dumb though? 🤨 That was in the past; present FPGAs are almost as fast as, and cheaper than, ASICs. Anyway, I'm not going to argue with someone like you. You think you are the only one with a degree?
Lundo 9 months ago
@Calculator The only advantage FPGAs have over ASICs is that they're reprogrammable. But generally you don't need to remake the logic of the device you're using. FPGAs are meant for *development* of hardware that will later be fabricated onto silicon.
Karyuu 3 months ago
The problem with progress is thinking we immediately need to abandon the old things. I think we would be more advanced if we tried using both systems, each where appropriate.
Suspense_Comix 2 months ago
Seeing Veritasium make these videos about computers impressed me. I am on the path of making mechanical computers powered by springs or cranks. NONE of the parts shall be electrical; not even a motor may power the computer.
Scott Meridew 20 days ago
Great video. Seems the future may be a hybrid solution involving both analog and digital processing working in harmony.
Ivan Angelov 3 months ago
Super informative video, bravo! The same way the more powerful but less flexible architecture of the GPU is integrated as a peripheral to the computer, despite being a computer in its own right, the same will go for analog modules, quantum services, and all the sensors and peripherals integrated into our systems. Sure, time is needed for all of this to be fitted together in the best way, but improvement will follow. Just as there was once a trend to make everything electronic, which was later reformed into using each technology where it fits best, this too will see some poor implementations until it gets refined.
1wsx10 9 months ago
Wow, that analog chip sounds extremely competitive; I'm surprised they already have something that good. Mad props to the guy who figured out the hack with the flash storage.
ROOOMBA 9 months ago
@Ole Stilling So it's not a question of "how tolerant do you design it", but "how tolerant *can* you design it" / "do you need a heating/cooling mantle for this or that use case or not".
ROOOMBA 9 months ago
@Ole Stilling I asked this because the elements in analog systems are well known for changing output voltage based on temperature; this has historically made the operational temperature range of these devices (once they get more complicated) far more limited than that of their digital counterparts. Just wondering if this technology is more tolerant than other solutions.
Ole Stilling 9 months ago
@ReZhorw Not really. Analog computers existed long before digital ones (the Antikythera mechanism), and when digital came along (Ada Lovelace) it still took a long time before digital computers gained ground. In that span of time analogue computers reigned: around 1948 analogue machine learning was demonstrated, and later we had, for example, analogue models of electric distribution grids. I used one in the 1960s.
Ole Stilling 9 months ago
@ROOOMBA It is as temperature-sensitive as you design it to be. (Wrong question.)
dcamron46 9 months ago
Carver Mead already did this in the 80s at Caltech. Nothing new here! Sorry.
MynameisBrianZX 22 days ago
Would it be more accurate to say that analog computation is finding an important new application (neural networks)? My naive thought is that sensors and analog signal-processing circuits count as very common analog computers. Or is there an important factor that defines "computation"?
joe bro 3 months ago
My dad did cable work, and when digital first came out and he was installing the lines, he said the cables weren't actually digital cables. The reason he knew this was that he noticed the cables bled signal like crazy, and he knew that digital required a clean enough signal to actually be accurate and work (like this video pointed out, too much variation in digital gets seen as a mistake/error). So the first digital cable lines were actually secretly analog; they just wanted to jump on the bandwagon and profit off the digital boom.
Batman 3 months ago
If digital requires 100% efficiency, analog requires 500%.
Jo 3 months ago
How can a cable be digital? Isn't it just the interpretation of the analog signal on the cable that makes the data "digital"? Which of course can have errors; that's why we have error-correcting codes for digital data, no?
guppy277 months ago
"Now the size of a transistor is approaching the size of an atom, so there are some fundamental physical challenges to further miniaturisation." What an articulation...! 👌👌 I am no scientist, nor an electronics engineer, yet I entirely understand and relate to what this is...! Feels great to be living in these times of TRvid with the likes of Derek, Tim Dodd, Tim Ellis and co.
Api Turaga 6 months ago
I have worked in instrumentation and controls for more than 20 years... We call a lot of things digital controls, but in front of your eyes, all moving machines are simply analog; pick any. A one or zero, a yes or no on a solid-state relay, when converted to physical movement, is always switched to 0 or 100%: that's conversion back to analogue. And the beauty of analogue is that you can hack that movement to stop the travel anywhere you choose, and spoil the digital myth.
Handsome von Derpinson
This actually helped me a lot to understand how neural networks work in general. For me it was kind of like black magic before. It still is to an extent, but knowing that modern neural networks are essentially more complex multi-layered perceptrons helped a lot.
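To make the "multi-layered perceptron" picture concrete, here is a minimal forward pass with hand-picked (not trained) weights chosen so the tiny network approximates XOR, which a single perceptron famously cannot compute.

```python
import math

# Minimal multi-layer perceptron forward pass: each neuron computes a
# weighted sum of its inputs plus a bias, then a sigmoid nonlinearity.
# The 2-2-1 weights below are hand-picked for illustration.

def neuron(inputs, weights, bias):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-s))          # sigmoid activation

def layer(inputs, weight_rows, biases):
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

hidden_w = [[6.0, 6.0], [-6.0, -6.0]]   # hidden neuron 1 ~ OR, neuron 2 ~ NAND
hidden_b = [-3.0, 9.0]
output_w = [[6.0, 6.0]]                 # output neuron ~ AND of the two
output_b = [-9.0]

def forward(x):
    return layer(layer(x, hidden_w, hidden_b), output_w, output_b)[0]

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(forward([a, b])))  # approximates XOR: 0, 1, 1, 0
```

Modern networks are the same structure scaled up: more layers, more neurons per layer, and weights found by training rather than by hand.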
fiso64 9 months ago
@Al Addin I'm not saying that it can't be defined; I'm saying that we haven't made a consistent definition yet, as evidenced by the many different theories surrounding it, all of which define consciousness differently. The term is closer to a loose philosophical concept than a mathematical definition. Therefore you can't say that it cannot be computed unless you at least explain which theory you're using. So no, I don't think I've embarrassed myself.
Al Addin 9 months ago
@fiso64 No, I am not afraid, but in my experience, discussing complex scientific topics with arrogant, disrespectful, provocative know-it-all pretenders who are laymen leads nowhere. A small hint at your contradiction: first you said that there are many theories and disputes about consciousness, and you claimed that consciousness can't even be defined. But then you admit the existence of the hard problem of consciousness, which can only be postulated if there is a definition of consciousness in the first place. Now what, smart a..? You clearly have no clue about this topic, and it becomes clearer the more you speak and twist things. Go ahead and ridicule yourself even more. Enough TRvid debates for today. Bye.
Mastakilla91 9 months ago
Funnily enough, this was the most efficient (effectiveness/simplicity) explanation of neural networks and machine learning I have ever encountered.
fiso64 9 months ago
@Al Addin So you're just not going to mention those mistakes I've supposedly made, and instead claim that I just googled it quickly... Are you afraid of a genuine discussion? If that makes you feel better, I guess.
Al Addin 9 months ago
@fiso64 You mix things up by googling for five minutes, but insist on your claim that it was me who had no clue about the topic? Do you actually realize that you contradict yourself in your last statement? If you are not ready to do proper research, then please don't waste my time by simply trolling and attacking people.
doctoroctos 11 days ago
Digital is either 0 or 1; analog is anything in between. Programmable resistors in a MAC (multiply-accumulate) function are a cool concept. How would the system ensure reliability and repeatability given aging, temperature, process variation, and defect growth?
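The MAC idea in the comment above can be sketched as follows: Ohm's law does each multiply, summing currents on a shared wire does the accumulate, and the reliability concern appears as a small random error on each conductance. The tolerance value is arbitrary, chosen only for illustration.

```python
import random

random.seed(1)

# Weights stored as conductances G, inputs applied as voltages V:
# Ohm's law does each multiply (I = G*V) and Kirchhoff's current law
# does the accumulate (currents sum on a shared wire). A small random
# error on each conductance stands in for process/age/temperature drift.

def analog_mac(voltages, conductances, tolerance=0.02):
    total_current = 0.0
    for v, g in zip(voltages, conductances):
        g_actual = g * (1 + random.gauss(0, tolerance))  # device variation
        total_current += g_actual * v
    return total_current

weights = [0.5, -1.0, 2.0, 0.25]
inputs = [1.0, 0.5, 0.1, 2.0]

exact = sum(w * x for w, x in zip(weights, inputs))   # the ideal dot product
approx = analog_mac(inputs, weights)
print(exact, approx)  # close but never bit-exact: precision traded for energy
```

The answer the comment is fishing for is that each run lands near, but not exactly on, the ideal value, and the error budget grows with device drift; that is the precision-for-efficiency trade the video describes.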
Walter Brown 4 months ago
Over 50 years ago, I was working for a US defense contractor in the electronic warfare field. One of the lab tools we used in our problem simulations was an EAI-380 analog computer, which used large "plug boards" to program the simulations. Using a computer like that requires more than a casual knowledge of computer elements such as operational amplifiers and the circuitry/feedback networks required for the simulations; for electrical engineers, this was right up our alley. Program (file) storage consisted of the large plug boards on which the simulations had been "patched". Analog simulations and programs are practical for small-scale problems which by design have limited scope, but larger systems-level problems seem to be better addressed with digital computers. Digital computers were in use well before the advent of solid-state devices: I believe most of the "Bletchley Park" efforts during WW2 were done with huge vacuum-tube digital computer systems. Bottom line: analog computers are certainly not new, and various forms of non-analog computing have been around for a while as well.
javed00 5 months ago
Loved the video; well explained, with sufficient depth to trigger those neurons! It definitely stirred a curiosity in me to learn more.
Magnulus76 6 months ago
Analog matrix multiplication is a brilliant idea where you need fast results over perfect precision, like in certain kinds of simulations or in AI implementations.
Andrey Lebedenko 9 months ago
4:52 NO! This is wrong! It is NOT how real neurons work, not even close. For starters, each neuron can collect multiple signals from the same input BEFORE deciding whether to fire or not, and the strength of a single input is not necessarily the main factor; in some cases repeated but weaker inputs will matter more. Second, the neuron has an internal feedback mechanism, looping its own signal back. Third, other neurons can affect that loop-back mechanism too.
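For contrast with the video's simple weighted-sum picture, a leaky integrate-and-fire sketch shows the temporal integration this comment points out: repeated weak inputs can fire a neuron that a single strong input never would. The constants below are illustrative, not biophysical.

```python
# Leaky integrate-and-fire neuron: membrane potential accumulates
# incoming signal, leaks a little each time step, and the neuron fires
# (then resets) when the potential crosses a threshold.

THRESHOLD = 1.0
LEAK = 0.9          # fraction of membrane potential kept per time step

def run(inputs):
    v, spikes = 0.0, 0
    for i in inputs:
        v = v * LEAK + i          # leaky integration over time
        if v >= THRESHOLD:        # fire and reset
            spikes += 1
            v = 0.0
    return spikes

one_strong = [0.9] + [0.0] * 9    # a single strong input: never fires
many_weak = [0.3] * 10            # weaker but repeated: crosses threshold

print(run(one_strong), run(many_weak))  # 0 spikes vs 2 spikes
```

So timing and repetition carry information here, not just instantaneous weighted sums, which is the gap between perceptron-style models and biological neurons the comment is complaining about.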
Major Pain 6 months ago
I work in edge computing and was impressed by your example of how neural networks work. I will borrow it, if you don't mind, the next time I am explaining the workings of such systems to an engineer.
Paul Jenkins 2 months ago
These two videos are among my favorites now. They brought disparate ideas and facts I've accumulated together so cleanly and simply.
soundspark 9 days ago
Could analog computing be used for graphics processing, where a slight bit of noise/graininess might not be an issue, and might even help dither out the banding that plagues too many games?
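The dithering idea can be sketched directly: quantizing a smooth ramp to a few levels produces long constant bands, while adding a little noise before quantizing trades the bands for grain. The noise amplitude and level count below are arbitrary.

```python
import random

random.seed(2)

# Quantize a smooth 0..1 ramp to 4 levels. Without dither the output has
# long constant runs (banding); adding noise before quantizing breaks
# the sharp band edges up into grain.

def quantize(x, levels=4):
    return round(x * (levels - 1)) / (levels - 1)

def transitions(seq):
    # how often the output changes from one sample to the next
    return sum(a != b for a, b in zip(seq, seq[1:]))

gradient = [i / 99 for i in range(100)]
banded = [quantize(x) for x in gradient]
dithered = [quantize(min(1.0, max(0.0, x + random.uniform(-0.15, 0.15))))
            for x in gradient]

print(transitions(banded), transitions(dithered))  # 3 sharp bands vs. grain
```

Both outputs use only 4 distinct values; the dithered one just distributes the transitions randomly, which the eye averages into a smoother ramp. Analog noise would provide that randomness for free.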
Luke Dykstra 3 months ago
Those analog AI chips are very interesting; it'd be cool to see them in smartphones one day.
Frag EightyFive 9 months ago
Using an analog computer to demonstrate differential equations is a perfect teaching tool. I really wish tools like this were used in colleges more often.
Frag EightyFive 7 months ago
@The Whiskey Gunner I have had many professors and coworkers like this. They have a wealth of knowledge, but many of them, myself included, seem to be terrible at communicating it. I might as well read a textbook or white paper, alone, in a dark basement. LoL 🤣
MinSang Ngaihte 9 months ago
Can I play Call of Duty on this computer?
The Whiskey Gunner 9 months ago
(Nein! White board and monotone voice only! No questions! No demonstrations! What do you mean explain this, it's on the board! Don't you have eyes!?) ^95% Of all math classes from elementary to bachelor's degree. I agree with OP, anything that could have taken math from abstract problems on paper to the real world is beneficial to everyone in the class that isn't the one kid that just "gets it"
Elrog3 9 months ago
@Noootch Fair enough. You are correct that he just said demonstrate differential equations and not specifically a differential equations course. And your second comment is partly correct. I do only want to take the courses that are required for getting the position I want, but I don't base it on salary. I would rather settle for a modest salary and have more control over how many hours I work, where I can live, and who I work with. Do you have something against students holding that attitude?
Elrog3 9 months ago
@Frag EightyFive Visual tools are great. 3Blue1Brown (here on YouTube) has some excellent examples of visualizations that help with understanding math concepts. Having an analog computer on hand to demonstrate with is unnecessary, and I would argue that the videos still do a better job. And if it were to go as far as making students physically use the analog computer, well, then you are just flat out wasting time.
NaTzu1001 3 months ago
My thinking: yes, we need a mixture of analogue and digital for better computing. The digital portion reduces noise so that many analogue operations can be chained for a deeper machine learning algorithm, acting as the activation function. But this might eventually be replaced by an analogue activation machine as well. We humans use analogue and digital combined all the time. In computers, the analogue parts speed up the GPU side, which is highly parallel; the digital parts help with the general-purpose side, the more logical and discrete CPU work. That's why we studied discrete mathematics in computer science, as well as "actual" maths like matrices and trigonometry. Can't wait to see future GPUs marketed by their number of neurons instead of their wattage. Another field I was thinking about while watching was quantum computing. I consider quantum computers both analogue and digital systems, as a qubit contains analogue information, yet we program them to do digital operations (for now).
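The hybrid idea above can be sketched in a few lines (my own toy illustration; function names and the noise figure are made up): the matrix multiplication happens "in analog" and picks up a small random error, while a digital activation stage between layers restores a clean signal.

```python
import random

def analog_matmul(weights, x, noise_sigma=0.01):
    # Each row's multiply-accumulate is done "in analog", so every
    # output picks up a small random error term.
    return [sum(w * xi for w, xi in zip(row, x)) + random.gauss(0.0, noise_sigma)
            for row in weights]

def digital_relu(v):
    # The digital stage between layers: exact, noise-free thresholding.
    return [max(0.0, s) for s in v]
```

Chaining analog_matmul -> digital_relu -> analog_matmul is the pattern: noise accumulates only within a layer, not across the whole network.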
jubuttib 6 months ago
16:50 Isn't this (using the cells with varying levels) basically similar to modern MLC, TLC and QLC NAND, where they store multiple bits per cell by setting them to varying levels? The main difference being that here they use the full continuous range, rather than trying to match e.g. the 16 specific levels a QLC NAND cell uses to store 4 bits.
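The contrast drawn above can be made concrete with a small sketch (my own illustration; all numbers are illustrative): used digitally, a QLC cell stores 4 bits as one of 16 evenly spaced target levels, and reading snaps back to the nearest level, discarding small errors, whereas an analog weight stored in the same cell would keep them.

```python
def qlc_write(bits):
    # Digital use of a cell: 4 bits -> one of 16 evenly spaced levels.
    assert len(bits) == 4
    n = int("".join(str(b) for b in bits), 2)
    return n / 15.0  # normalized cell level in [0, 1]

def qlc_read(level):
    # Reading snaps to the nearest of the 16 levels, discarding noise.
    n = round(level * 15)
    return [int(b) for b in format(n, "04b")]
```

A perturbation under half a level step (1/30 ≈ 0.033 here) is fully absorbed by the digital read, which is exactly the error tolerance the analog scheme gives up in exchange for using the full range.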
Travis Carpenter
Travis Carpenter 2 months ago
In Star Trek: The Next Generation, the android Data had a brain consisting of a "positronic matrix." He created a daughter with a similarly designed brain, who subsequently died due to instability of the matrix. The show started in 1987. Although Data had digital components (storage of 800 quadrillion bits, or about 100k TB), they never implied he was purely digital. I think it's cool that they recognized back then that creating a futuristic android would take a combination of analog and digital. They simply called it the made-up "positronic matrix" to fill in the gaps.
Edson Andrade
Edson Andrade 6 months ago
As some have already commented here, your simple, example-driven explanations make it easier to understand and assimilate the information you're trying to pass on. Thank you!
Joe Sterling
Joe Sterling 9 months ago
The biggest issue is distortion. Inexact calculations due to imperfect components, degradation of the data when transmitted (wired or wireless), external EM interference, all conspire to make the use of analog a special challenge. Mixing digital and analog to play to the strengths of each along the way intrigues me. I'm old enough to have experienced the full evolution of digital computing. My mindset is therefore quite biased toward it. What you propose would be quite the eye opener for me, if it actually can be made to work as prolifically as current digital technology.
NMCCW 6 months ago
Your mindset is biased towards digital, yet your mind is ironically theorized to be a digital-analog amalgamation.
Rafael Perez
Rafael Perez 9 months ago
@Opsse you bring up very good points, but I think what the video failed to mention is that scaling from every new CMOS node tends to favor digital performance. Analog performance is actually pretty bad in various forms for the highly scaled CMOS nodes (below 22 nm). Scaled transistors suffer from a variety of nonidealities such as DIBL and other short channel effects that reduce gain, speed, etc. And even interconnects/metal layers like you said are more problematic. If you look at the chips from Mythic, they're actually using a 40nm cmos node for these exact reasons and also because it's cheaper.
Opsse 9 months ago
As a PhD student in this field, I can answer some of your questions. Yes, we usually talk more about noise than distortion. And thermal noise is not the only issue: there is read and write variability, resistance drift over time, the resistance of interconnections, ... It is true that neural networks can sometimes take advantage of noise to avoid overfitting, but only a reasonable amount of noise, and only in some cases. Self-correcting algorithms and error correction are options, but it's not that easy. Usually, this kind of method sacrifices performance or requires more energy (which is the opposite of what we want). As for mixing digital and analog, they presented it nicely in the video, but the digital/analog converters require a lot of energy (sometimes more than the vector-matrix multiplication itself), so we don't want to do it too often.
leftaroundabout 9 months ago
@StevenSiew2 That's true, but noise is something that AI needs to deal with anyway, because the inputs will always be noisy to begin with. It can actually be useful to _add artificial noise_ while training a digital NN, to avoid overfitting issues. (Stochastic gradient descent can also be seen as a way of making the training “noisy”.) As long as the perturbations are small and random, training won't be affected negatively. Distortions, however, are hard to deal with. You may be able to train a model on a particular chip that has such and such distortion; but because the distortion properties don't fluctuate and act as constant-but-unknown biases, the weights will ruthlessly overfit to this particular chip, and then it probably won't work at all on another copy.
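The noise-versus-distortion distinction in this thread fits in two functions (my own toy illustration; the sigma and bias values are arbitrary): noise is a fresh random perturbation on every read that averages toward zero, while distortion is a fixed, repeatable offset of one particular chip that training will silently absorb into the weights.

```python
import random

def noisy_read(x, sigma=0.05):
    # Noise: different on every call; many reads average toward x.
    return x + random.gauss(0.0, sigma)

def distorted_read(x, chip_bias=0.05):
    # Distortion: the same offset every time on this chip, so a model
    # trained against it overfits to this one device.
    return x + chip_bias
```

Averaging many noisy_read(0.0) samples tends to 0, but distorted_read(0.0) stays at the bias no matter how many samples you take, which is why a trained analog model may not transfer to another copy of the chip.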
StevenSiew2 9 months ago
Distortion, really? I am under the impression that the biggest problem with analog computers is NOISE. You can never get rid of noise in an electrical system. Even if the hardware has no distortion, the inherent thermal noise in the system will cause some small calculation error.
Thomas Dent
Thomas Dent 6 months ago
This is one of the best YouTube videos I've ever seen. I'm a biology science guy, but I love this channel for its incredible explanations in areas of science I don't know as much about. Keep it up brotha
G M Months ago
I’m curious what your take on Quantum computing is? I have seen a growing number of skeptics on the topic recently. Some were pretty convincing to me. It seems like one of those things that is all hype. Almost like fusion energy except I think fusion energy will actually happen one day.
J. Curtis
J. Curtis 5 months ago
Bravo! A tour de force of science education. You folks rock!
nosensibleusername 6 months ago
It may come as a shock, but my money is that ultimately, everything functions digitally. It's just the observational resolution that determines whether you perceive it as one or the other . . .
tenou213 9 months ago
I'm a little disappointed by the title but impressed by the content. It's less "we're building computers wrong" and more "old method is relevant in a niche application". There's also the eventual plans for fully commercial quantum supercomputing clusters and ever faster internet connections which might further limit the applicability of these chips going forward. However, building processing-specialized chips instead of relying on graphics cards seems really promising in the short term so long as the market stabilizes.
Eugene 8 months ago
The world needs AI so that people don't have to work at all
No Name
No Name 9 months ago
Only when quantum computers can reliably run at room temperature, which isn't going to happen anytime soon.
lmauter 9 months ago
@Internet Tough Guy I actually stayed away from this video for a while because of the title, but because I always like the content of this channel, I finally decided to give it a watch
AnonyMoose 9 months ago
@KD It's not niche at all. ML is used in a ton of places. Any time you use "hey Google/hey Siri", ML is used. Performed a Google search recently? Had your email automatically filtered for spam? ML was likely used behind the scenes. How about your phone? Have you taken any pictures with it? Chances are, if it's a modern high-end phone, it used some ML functionality for cleaning up the image. Netflix recommendations? ML. Social media recommended feeds? ML. YouTube recommendations? ML. It's used everywhere - the above scenarios are only the tip of the iceberg. And on top of that, it's incredibly compute-intensive - particularly when looking at total compute cycles across applications.
Douglas Mitchell
Douglas Mitchell 9 months ago
​@John Botris Well yes, getting more views is the object of all clickbait titles. It's still off-putting
Zak Seipel
Zak Seipel 3 months ago
I am really into synthesizers and especially modular… a friend of mine once asked if it could do my taxes for me. Now I think it just might be able to. + or - a few dollars
Hyper Baroque
Hyper Baroque 6 months ago
"About 1% error" - not when you're compounding errors with sequential math circuits. Three components off their listed value by 1% working on the same problem don't merely add or multiply error: one of them could be acting as an integrand of the second component in relation to the third. The resulting area could be off by much more than a few percent of what the math shows.
mage 3 months ago
You can also reduce the error via mathematical magic. Continuous floating point rounding errors are definitely a thing, and have killed people before, but they've also been fixed by a simple software update rather than changing the hardware.
David Low
David Low 4 months ago
I think the error of individual components is much less; the 1% is just a rough figure for the error of the whole system.
But you have to remember, too, that multiple redundant operations could be performed to mitigate this.
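The compounding point in this thread is easy to put on the back of an envelope (my own sketch; the tolerance figure is illustrative): chaining stages that are each off by up to 1% makes the worst-case error grow multiplicatively, slightly faster than N × 1%.

```python
def worst_case_error(stages=3, tol=0.01):
    # Every stage sits at the high edge of its tolerance band, so the
    # gains multiply: worst case is (1 + tol)**stages - 1.
    gain = 1.0
    for _ in range(stages):
        gain *= 1.0 + tol
    return gain - 1.0
```

worst_case_error(3) comes to about 3.03%, already more than three times 1%, and that only covers linear gain stages; nonlinear operations such as integration can amplify the error further, which is the stronger form of the objection.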
Christopher Pardell
Christopher Pardell 3 months ago
There were computational control systems built for military aircraft that used fluidics: a blend of analog and digital computing that utilized high-pressure oil flowing through spaces in a stack of very thin metal plates. They performed Boolean operations as well as giving dynamic analog control of hydraulic distribution for control surfaces and the like. They would make an interesting topic for a related video. And they were immune to EMP in the event of a nuclear attack.
challa lalitha
challa lalitha 4 months ago
Where does quantum computing fit in this scenario? Great presentation; I would like to know more about this topic. Excellent video.
Robert B
Robert B 9 months ago
As a guy who helps manufacture flash memory I find this really intriguing: especially because flash memory is continuing to scale via 3-D layering, so there’s a lot of potential, especially if you can build that hardware for multiplication into the chip architecture.
Raymund Hofmann
Raymund Hofmann 9 months ago
@Gavin Greenwalt So why have an algorithm and all the devices with sensors, radar, lidar, cameras and GPS in the car anyway, then? The responsibility is still on the driver, but now it is even harder for him to fulfill, as he feels freed from it, only for it to hit him harder when he doesn't expect it.
Raymund Hofmann
Raymund Hofmann 9 months ago
@Michiel van der Blonk If no one were punished for doing something wrong that hurts the community, the community would fail. There would be no feedback regulating and limiting individual behavior to work for the community. Intent is hardly objectifiable. "Punishing intent" means tyranny of the people determining intent. VW dieselgate was objectively verifiable fraud, but I can assure you that their intent was "good". Because diesel engines emit less CO2 for the same power, they are deemed to "save the world" and save us from the "climate doom". Hiding the exhaust problem of diesel engines and not putting in the inconvenient and more expensive exhaust aftertreatment was done with good intent. The ends justify the means. You sound woke and clueless.
Gavin Greenwalt
Gavin Greenwalt 9 months ago
@Raymund Hofmann Uber explicitly removed emergency braking from the algorithm to avoid potential phantom braking events and relied on the human to brake in an emergency. The software identified a likely collision.
Michiel van der Blonk
Michiel van der Blonk 9 months ago
@Raymund Hofmann Why does someone always need to be punished? And how does punishing humans help? We still drink and drive. In the case of the Fukushima meltdown people died. The Challenger blew up. There is only one thing you can do: investigate to make sure you improve the technology in such a way that it doesn't happen again. Punishing the engineers who made the mistake on the Challenger improves nothing. What should be punished is malicious intent, like in the case of the VW fraud. Computers have none.
Spiff McSparkles
Spiff McSparkles 6 months ago
To think I just got into analog synthesisers. Also this was the first and only time someone managed to explain machine learning in a way that clicked for me.
Devin Coleman
Devin Coleman 5 months ago
Sometimes it hits you that you’re a brain trying to comprehend itself. I had a lot of neurons firing while following and making sure I understood the basic model of what makes neurons fire. And now my neurons are instructing my hands to type this, using neurons that are well trained in dexterity related to smartphone use. This is turning into a Stanley Parable-type loop so I’ll be done :)
John Knight
John Knight 3 months ago
Could analogue computing be used in any part of the crypto mining process to make it more energy efficient? If I can give some constructive criticism (which I normally wouldn't do), the section on AI and neural networks goes off on far too long a tangent. You would be better off giving a brief summary of that computing scene, then incorporating more parts in finer detail as you show how analogue computing can help out in AI and neural networks.
yashvi sharma
yashvi sharma 4 months ago
I just want to say thank you. These videos really inspire me and it's beautiful taking in this knowledge. Thanks for all your hard work
Terry Talks
Terry Talks 9 months ago
As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.
wajahat ali
wajahat ali 9 months ago
@Reddy Anna same bro i am 22 and my life sucks
Terry Talks
Terry Talks 9 months ago
@Reddy Anna You have plenty of road left to travel, follow what you love, enjoy the journey, don't let the bumps in the road stop you and if you can get a soul mate to share it with you it will all be good.
Vource 9 months ago
Yep, you're 70 mate
Reddy Anna
Reddy Anna 9 months ago
@Terry Talks Good for you my man! I am still 20 and don't know what to do in life :(
Terry Talks
Terry Talks 9 months ago
@Magma Thanks for the heads up. I've got a good woman, 6 children, 8 grandchildren and a recently placed stent in my heart that keeps me going :)