Future Computers Will Be Radically Different (Analog Computing) 

14M subscribers
11M views
339,000 likes (99%)

Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.
Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: • Self Driving Cars [S1E...
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. - ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. - ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. - ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. - ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. - ve42.co/Mason1958
Alvinn driving NavLab footage - ve42.co/NavLab
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. - ve42.co/Pomerleau1989
ImageNet website - ve42.co/ImageNet
Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. - ve42.co/ImageNetChallenge
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. - ve42.co/AlexNet
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. - ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. - ve42.co/MythicBlog
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. - ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. - ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. - ve42.co/Aspinity
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549-555. - ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144-147. - ve42.co/Waldrop2016
Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal
Written by Derek Muller, Stephen Welch, and Emily Zhang
Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
Edited by Derek Muller
Additional video/photos supplied by Getty Images and Pond5
Music from Epidemic Sound
Produced by Derek Muller, Petr Lebedev, and Emily Zhang



28 Nov 2023






COMMENTS: 13K
@5MadMovieMakers a year ago
Hyped for the future of computing. Analog and digital could work together to make some cool stuff
@teru797 a year ago
True AI is going to be the end of us. Why would you want that?
@kalindibang9578 a year ago
@@teru797 true AI won't be possible for the next 200 years, and by then, if humanity keeps living the way it does, we aren't going to survive anyway
@@teru797 it would still take quantum computers to have the memory necessary to run it
@jpthepug3126 a year ago
@@teru797 cool
@@teru797 we are already the end of us
@DomDomPop a year ago
It’s funny, for those of us who are into electronic music production, analog never left! There are lots of great analog synthesizers out there that can produce all kinds of complex waveforms, and some of us have been known to tack an oscilloscope on to modular gear to view and use those waveforms. Even some relatively simple gear can produce complex, “3D” structures with the correct cable patches. A lot of what you described at the beginning is the backbone of synthesis for music, and the same principles obviously apply to mathematical operations.
@rogerphelps9939 9 months ago
You can do everything digitally that an analog system can do and more. An example is resampling in order to change the frequency scale of a recording. This can be done in real time using digital methods, not so much for analog methods.
@DomDomPop 9 months ago
@@rogerphelps9939 Depends on what you're doing and what's important to you. Analog synths are great for experimenting with the knobs and patch bay (if available) and learning exactly what effect each change has on the overall waveforms. They're really great for learning what you're doing and what you're getting as a result. Yes, there are software synths meant to emulate hardware knobs and a patch bay, but I haven't found clicking through all that as valuable as patching and experimenting yourself. That part depends on the person, though.

What doesn't depend on the person, and is arguably more important, is that aliasing can become a problem on digital synths. When you start doing crazy cross-modulation between sources, or you're dealing with lots of harmonics, your sound will suffer if the processor can't keep up. Same with very high frequencies. It depends on the synth, of course, but analog synths tend to have a warmer, purer sound as well, because you don't have to emulate all those harmonics.

It really comes down to the same arguments being made here about analog computers: on analog there's no processor overhead needed to create some very complex shapes, and to do so perfectly accurately. I use both types of synths, as lots of people do, and I would never say that one makes the other unnecessary. Hell, there are hybrid synths that give a mostly analog signal path while allowing, say, a digital sample-and-hold circuit and the ability to save certain parameters. People make those kinds of things for a reason, you know?
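The aliasing problem raised in the comment above has a compact demonstration. A Python sketch (standard library only; the sample rate and frequencies are chosen for illustration): a tone above the Nyquist limit produces exactly the same samples as a lower-frequency "alias", here up to a sign flip, so the digital system cannot tell them apart.

```python
import math

def sample_tone(freq_hz, sample_rate_hz, n_samples):
    """Sample a unit-amplitude sine at the given sample rate."""
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate_hz)
            for n in range(n_samples)]

sample_rate = 48_000  # a typical audio rate; Nyquist limit is 24 kHz

# A 40 kHz tone is above Nyquist; it folds down to |48 - 40| = 8 kHz.
high = sample_tone(40_000, sample_rate, 64)
alias = sample_tone(8_000, sample_rate, 64)

# The two sample streams are identical except for sign: high[n] == -alias[n].
max_diff = max(abs(a + b) for a, b in zip(high, alias))
```

Harmonics generated by cross-modulation can land above Nyquist the same way, which is why naive digital synthesis needs oversampling or band-limited oscillators.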
@victorpereira8000 9 months ago
Pythagoras discovered math through music, I think, right? Really like your comment
@RAndrewNeal 9 months ago
@@rogerphelps9939 The difference is that you need billions to trillions of transistors to do digitally what can be done with tens to hundreds of transistors in analog.
@rogerphelps9939 9 months ago
@@RAndrewNeal Wrong. The errors arising from component tolerances, noise and temperature dependent offsets make anything complicated pretty much impossible in analog. Transistors in digital processors are extremely cheap. Provided you have good DACs and ADCs you can do anything to whatever precision you need in digital.
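The "whatever precision you need" point above has a standard back-of-the-envelope form: an ideal n-bit converter splits its range into 2^n levels, and its quantization-limited SNR is about 6.02n + 1.76 dB. A Python sketch of the textbook numbers (the 5 V range is an arbitrary example):

```python
def adc_step_volts(full_scale_volts, bits):
    """Step size (one LSB) of an ideal converter: full scale over 2**bits levels."""
    return full_scale_volts / (2 ** bits)

def ideal_snr_db(bits):
    """Quantization-limited SNR of an ideal n-bit converter for a
    full-scale sine input: 6.02*n + 1.76 dB."""
    return 6.02 * bits + 1.76

# Over a 5 V range, each extra bit halves the worst-case error (half an LSB):
for bits in (8, 12, 16):
    print(f"{bits:>2}-bit: LSB = {adc_step_volts(5.0, bits) * 1e3:.3f} mV, "
          f"ideal SNR = {ideal_snr_db(bits):.1f} dB")
```

This is what makes the digital-precision argument concrete: adding bits buys exponential accuracy, whereas an analog path has to fight component tolerances and drift linearly at every stage.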
@williamtell1477 9 months ago
AI researcher here; you did a great job on this. For anyone interested, the book Perceptrons by Minsky and Papert is a classic, with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today, and it's fascinating to read what scientists were thinking about neural networks during the year of the moon landing!
@Musbiq months ago
Great recommendation. Thanks.
Funny, I always thought of Asimov's positronic robots as analog computing, but as a programmer it was difficult to understand how to work with analog instead of binary. This video makes a lot of sense: I can see how the combination of voltage and frequency can influence the result, and how the combined power of multiple inputs, each weighing differently in the scale, can determine the final result. I only catch a glimpse, I know, but my imagination and this video let me see how it can relate to our brain's neural network. Amazing!!
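The weighted-inputs picture in the comment above is essentially Rosenblatt's perceptron, cited in the description. A minimal sketch (the weights, bias, and input values here are made up for illustration):

```python
def perceptron(inputs, weights, bias):
    """Fire (output 1) if the weighted sum of inputs crosses the
    threshold encoded by the bias, else stay silent (output 0)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# Two inputs weighted differently, as the comment describes: the
# heavier weight dominates the decision.
out = perceptron([1.0, 1.0], [0.9, -0.4], bias=-0.2)  # 0.9 - 0.4 - 0.2 = 0.3 > 0, so 1
```

In the analog chips discussed in the video, the multiply-and-sum inside this function is done by physics (currents adding on a wire) rather than by arithmetic instructions.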
@aoeu256 11 months ago
Analog has some sort of an error factor, but that error factor can be used for good in terms of evolutionary algorithms.
@mtgatutorials368 10 months ago
The world is NOT Digital aka quantum, but it is Analog. These machines will prove this fact and change how we come to see reality.
@RAF-777 10 months ago
@@mtgatutorials368 I am not 100% sure, but quantum computing seems a bit like an analog of the analog computer - similar principles, but at a much smaller scale. An analog computer uses thousands to millions of atoms to represent a value; a quantum one operates on single atoms, electrons, photons and other particles. And perhaps quantum computers also use entanglement, which I think no one quite understands, so people explain it with nonsense ideas like communication faster than light. They use paradoxes to explain that possibility, but it would still break the basic law of physics: nothing is faster than light, ever. But I am not sure 😃
@mtgatutorials368 10 months ago
@@RAF-777 oh, I can explain it really easy. 4 spatial dimensions, in an expanding HyperTorus and HyperSphere
@cdreid9999 10 months ago
@@aoeu256 No, analog doesn't. He was describing our implementations of analog. There is no inherent theoretical error rate in analog systems; they are in fact theoretically perfectly accurate, while digital can't be. An analog circuit could theoretically carry the precise value of pi. A digital one can't.
@LuisCasstle 9 months ago
So Rosenblatt was practically decades ahead of his time - imagine what he could achieve with modern computers if he were around today. I am always fascinated by people who lived long ago and had ideas that took decades or centuries to be proven right for lack of technology. Like a time traveler who landed in the wrong era and had to work with what was available, a la Back to the Future.
Analogue was never meant to die; the technology of the time was the limiting factor, IMO. It looks like an analogue-digital hybrid system could do wonders in computing.
@NotWhatYouThink a year ago
Great episode. Hadn’t considered the mix of digital and analog computers in a complementary fashion. I guess it’s not what I thought!
@WeponizedAutism a year ago
True, but the actual impact of this is not what you think.
@mushin111 a year ago
Jesus, could you astroturf a bit harder please?
@LeoStaley a year ago
Until the '90s, US warships used mechanical computers to aim their guns - something that would be perfect for your channel.
@deusexaethera a year ago
I see what you did there.
@dieSpinnt a year ago
BS! Fourier ... ROTFL
@ShallowEra 5 months ago
Now imagine the binary transistor switch being replaced by atomic valency and how much will become possible with quantum computing storing data with DNA and RNA coding.
@koborkutya7338 11 months ago
I recall our control-systems teacher at university in the '90s saying the Space Shuttle flight controls contained analog computing, because they had to process several thousand sensors' inputs to produce outputs and digital was just too slow for the job.
@rogerphelps9939 9 months ago
He lied.
@TARS-CASE 5 months ago
@@rogerphelps9939 The Space Shuttle did indeed use analog computing for some of its flight control systems - a hybrid digital/analog design. Most of the high-level control logic was handled by digital computers, but critical low-level control functions were performed by analog circuits. The analog components could process sensor inputs and produce control outputs much faster - on the order of microseconds - than even the fastest digital computers of the era, which took milliseconds. That speed was essential for stability during flight.
@pbinnj3250 months ago
I cannot express all of my appreciation for this video. I understood it and I gained an enormous amount from it. If I sound unduly excited, it’s because I thought this stuff was beyond me. Thank you.
@TheCaptainCrack 11 months ago
I like the idea. It's like letting the universe/physics do all the (computational) work for you. Now I bet that's a turn of events it didn't see coming!
@aoeu256 11 months ago
Actually, quantum computers are closer to the universe, I think.
@reh3884 10 months ago
Quantum Theory says the universe is digital not analog.
@johndoh5182 10 months ago
Umm, either way a team of engineers has to design a solution. And because digital systems are crazy fast, and their main issue is how slow main memory is, it's easier to write a program than to design an analog system to represent the math problems you're trying to solve. Either way the engineers have to know the math and whatever data is needed. So the universe that exists in the big brains of those engineers is solving the problem one way or another, and programming a digital computer is easier than devising an analog solution. It's fewer steps to code the math directly into a program.
@xahst 10 months ago
considering both analog and digital methodologies exist in our universe, I'd say both methods use our universe's laws extensively lol Literally anything that is possible in our universe, the universe supports lol
@j-davis7290 10 months ago
Using general-purpose digital computation to create highly specific analog computers, with some form of intercommunication between the two, is a logical next step
@adamkallaev3573 11 months ago
If it makes my graphics card cheaper, I'm all for it
Finally, a fellow PCMR member
@cdreid9999 10 months ago
you dreamer you
@jerycaryy4342 9 months ago
@@hridayawesomeheart9477 finally, an average redditor
@BlurryDrew 9 months ago
It sounds like it could make GPUs more power efficient. GPUs are starting to use AI to make certain computations more accurate, so maybe an analog chip on our GPUs could handle that instead.
@notisike3553 9 months ago
@@BlurryDrew I agree, but the first major bottleneck is, as he said in the video, the massive power requirement to train each AI - each needing three households' combined annual energy usage - so mass production seems inefficient.
@marsgizmo a year ago
amazing episode, well explained! 👏
@solarwolf678 a year ago
@bagel7860 11 months ago
It makes sense, given how our brains don't have a "clock speed". The entire makeup of cells operates autonomously
@gtVel 11 months ago
Fearless Joy Wouldn't a human brain be nothing more than an incredibly complex analog computer made of biomass?
@aoeu256 11 months ago
Fearless Joy Computers will get closer to human brains if we merge CPU and memory, like in a cellular automata model
@violante1421 10 months ago
Fearless Joy what is it then?
@alexcapek561 10 months ago
Fearless Joy The human brain is 100% a biological computer.
@timotheesoriano 11 months ago
What a great video. This is the first time I clearly understand how we got to deep neural algorithms: most of the time we learn what they are and how they work, but here I see the fundamental ideas that led to them. Besides, going back to analog but with newer technologies is fascinating, as if we were discovering our world again. Digital computing was a revolution, and there is a new revolution happening here, certainly using both principles (with AD or DA converters in between). Even if a lot of research goes into the quantum domain, which is still immature, analog computation already allows a lot.

Also, as an engineer in the space domain, I see here an extension of what we do all the time: acquisition values are often provided as a voltage, with a calibration curve allowing their interpretation. So we commonly use analog signals and computations. We even use matrix-style analog encodings, though usually only for a 2x2 case (example: if two switches report their positions through resistance values on different scales, a single signal holds the information about both positions). So we do use analog signals able to carry more complex information than digital ones. But here it goes a great step further, because it applies to matrix computing. This application is going to revolutionize computing science, and it may happen quite soon, because all the fundamental science and technology is already known.
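The two-switches-on-one-signal trick described in the comment above can be sketched in Python. The resistance scales here are hypothetical, chosen only so that the series sum is unambiguous, which is exactly why the commenter's two switches use different scales:

```python
def decode_switches(resistance_ohms):
    """Recover two switch states from one measured resistance.

    Hypothetical encoding: switch A contributes 0 or 100 ohms and
    switch B contributes 0 or 1000 ohms, wired in series, so each
    of the four possible sums is distinct.
    """
    b_on = resistance_ohms >= 1000
    a_on = (resistance_ohms % 1000) >= 100
    return a_on, b_on

# The four possible measurements map back to the four state pairs:
assert decode_switches(0) == (False, False)
assert decode_switches(100) == (True, False)
assert decode_switches(1000) == (False, True)
assert decode_switches(1100) == (True, True)
```

One wire, two bits - the same idea, scaled up to continuous weights, is what makes analog matrix encoding so dense.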
@beneehayah4401 9 months ago
The brain functions as both analog and digital, depending on which part is involved: neurons fire digitally while synapses are analog. There was a woman who spent 12 years on her PhD thesis studying the brain to develop an integrated circuit that mimics how it functions. She arranged the transistors and other structures to be similar in design to the brain. This IC became a vision chip, with the goal of implanting it in blind humans to restore some level of sight. It worked well in human test cases
@snerttt 2 months ago
I'd be interested to see a digital computer adopt an analogue component, possibly for physics simulation, much like how a GPU is used to create graphics independently of the CPU.
@lc5945 9 months ago
I remember the first time I heard the term "deep networks": it was back in 2009 when I was starting my MSc (using mainly SVMs). A guy at the same institute was finishing his PhD and introduced me to the concept and the struggles (Nehalem 8-core era)... the leaps in performance made in NNs since then, thanks to GPGPU, are enormous
@laveeyang2797 a year ago
20 years ago, my computer science professor George Davida said analog will make a come back for complex calculation. Looks like he's right!
@bz3086 a year ago
Lavee, Yeah. But his stock ownership is 99% AI TECH.
@megapet777 a year ago
what a genius
At least for anything that involves matrix multiplication. I actually wonder whether we should start thinking about how digital and analog can work together.
@jaycee8733 a year ago
@@blah-blah-blah4715 that is actually true. There is no good or bad system (mostly); they are just better suited to different jobs according to their quirks.
@TB-jl9fr a year ago
I don't know, man. With the birth of FPGA technology, analog is still too inflexible and single-purpose. With digital components becoming more and more efficient, I don't really see a true comeback of analog at all.
@harsha_m69 9 months ago
Here's my input: as a Computer Science student who researched and wrote a couple of papers on quantum computers, I confidently feel quantum computing is the future. However, I understand that for quantum computers to function properly you need analog circuitry: they cannot operate like regular computers and need specific temperatures, wiring and so on. It's like we're finding the pieces of computation and just need to glue them together to achieve perfection and forever change the course of computation. But in this journey there are a lot of missing pieces, and as we find each one, the final puzzle's shape gets more pronounced and leads in a different direction than we originally anticipated.
@mikeyunovapix7181 5 months ago
Imagine applying a combination of digital and analog to a gaming PC, where it would get a whole boatload of benefits.
@thebeezkneez7559 6 months ago
Analog computers are art to me, whereas digital computers are almost soulless. Maybe that's weird, but the concept of making a computer that is analogous to a certain real-world system is interesting.
@DizzyDad a year ago
In Star Trek Next Generation, the android Data had a brain consisting of a "positronic matrix." He created a daughter with a similarly designed brain, who subsequently died due to instability of the matrix. The show started in 1987. Although Data had digital components (storage 800 quadrillion bits, or about 100k TB), they did not imply in any way that he was purely digital. I think it's cool that they recognized back then the combination of analog/digital when creating the futuristic android. They simply called it the made-up "positronic matrix" to fill in the gaps.
@rogerphelps9939 9 months ago
They recognised no such thing. They made up some gobbledegook that they thought would play well to the audience.
@asg32000 months ago
I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!
@ModernBuilds a year ago
Your videos are always awesome, and the fact that even I can comprehend them is amazing 🔥
I love science, so randomness be damned: I ask around whether anyone has some scientific watch suggestions, because learning never ends.
@yvettedath1510 a year ago
AI - they want AI to control humanity
@Paul-rs4gd 11 months ago
I can see this analog technology being used in special-purpose AI processors attached to normal digital computers. It makes sense: they could provide very large-scale, cheap and energy-efficient neural-net acceleration. Since it appears that "scale" is the most important thing for AI, it is really important to bring down cost and energy consumption so we can all run GPT-3 on our laptops :)
Awesome series; so cool to see the potential of analogue and the breakdown of the technology. My theory is that when robots finally achieve sentience, they will be much like us.
@maxismaximov2736 10 months ago
This is what I call working smarter, not harder. Instead of forcing a conventional CPU to have smaller structures and perform operations faster, they utilize the physics of the electricity in the processors to compute faster.
@HrLBolle 3 months ago
Mythic's approach reminds me of the copper-wire memory planes with ferromagnetic rings representing the bits, used as memory for the AGC (Apollo Guidance Computer). The video this memory is based on was released by Destin, aka Smarter Every Day, and covered his and Linus Sebastian's meeting with Luke Talley, a former IBM employee and, at the time of the Apollo missions, a member of the data analysis teams responsible for the analysis, evaluation and processing of the telemetry data received from the Apollo instrument ring.
@SuperNovaJinckUFO 3 months ago
After doing a bachelor's degree that focuses on solving differential equations digitally, and learning just how difficult that can get, the intro to this video blew my mind
@ElectroBOOM a year ago
Awesome information!
@Mani_Umakant23 a year ago
I gave you your first like 😁
@N____er a year ago
@@Mani_Umakant23 Why would you like such an unoriginal comment that provides so little value or thought?
@Mani_Umakant23 a year ago
@@N____er Just like that, it looked sexy.
@@N____er Don't say anything bad about ElectroBOOM; he is such a wonderful creator
Hi sir, I am a big fan of yours
This is really cool, and there is another startup with a different approach to analog: instead of using voltages and currents, they use light. It is really interesting how analog is coming back. I would really appreciate it if you made a video about this. The startup is Lightelligence. As always, thanks for these videos.
@bishalpaudel5747 5 months ago
This is a very well explained video on analog computing. Never could I have thought the topic of analog computing could be put into a 20-minute video with such phenomenal animation and explanation. Respect your work and effort to make science available to all for free. Respect 🙏
@cocoritosss8669 a year ago
It looks efficient, although a self-taught machine may need a way to tweak the conductance values on the fly, which is an interesting question mark... Anyway, love the idea of analog flash memories used as NPUs. It looks like an FPGA.
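For context on the flash-as-NPU idea in the comment above: the trick is that Ohm's law does each multiplication (I = G x V, with the stored weight as a conductance) and Kirchhoff's current law does the addition, since currents flowing into a shared wire simply sum. A Python simulation of one output line (the weight and input values are invented for illustration):

```python
def analog_mac(voltages, conductances):
    """One output current of an analog matrix-vector multiply.

    Each product is Ohm's law (I = G * V); the sum is Kirchhoff's
    current law (currents into a shared wire add). Simulated here
    with ordinary arithmetic.
    """
    return sum(g * v for g, v in zip(conductances, voltages))

# A stored weight row (as conductances, in siemens) times an input
# voltage vector: the dot product falls out of the physics.
weights_siemens = [1e-3, 2e-3, 0.5e-3]
inputs_volts = [0.2, 0.1, 0.4]
current_amps = analog_mac(inputs_volts, weights_siemens)  # 6e-4 A
```

Reprogramming the weights means rewriting the flash cells' conductances, which is slower than a register write - one reason on-the-fly retraining is the open question the comment points at.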
@photorealm 3 months ago
When I started thinking about artificial neural nets, I assumed they would only ever really happen on specialized analog computers in the future. Then Google and others, along with more powerful digital computers, made them work pretty darn well. I love being in this moment of history, watching so much science fiction slowly become reality.
@BiblicalBasics 8 months ago
A superb piece, thank you! Yes, maybe this is the way forward for AI in order to consume all the world's electricity.
@anishsaxena1226 a year ago
As a young PhD student in Computer Science, I found your explanation of how neural networks came to be and evolved, and the math behind them, the cleanest and most accessible I have come across. Since I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!
@deepblue3682 a year ago
From USA?
@alex.g7317 a year ago
There’s a reason he has 11, 000, 000, 000 subscribers after all 😏
@unstable-horse a year ago
@@alex.g7317 Wow, that's more than the population of Earth. Where does he find all those subscribers??
@exoops a year ago
@@unstable-horse Mars
@alex.g7317 a year ago
@@unstable-horse omg, lol 😂. That was a typo! I meant 11, 000, 000!
As engineers, it is our job to apply the right technology - analog, digital, or both - to solve engineering challenges.
The main problem with analog is saturation of current or voltage, which induces loss of linearity. One way to resolve it, as he said, is to reconvert to digital and amplify by a good factor.
@mdzaid5925 5 months ago
I feel that analog will make a very strong comeback, but only in specialized applications. For general-purpose computing, digital will retain its dominance.
@GGSHeadoR 2 months ago
Congratulations on repeating what you just heard in the video.
@GrassXMagnum 2 months ago
@@GGSHeadoR considering most people comment before even finishing the video, there's a chance they didn't actually hear that part 😅
@schrodingerscat1863 2 months ago
It never went away; it just became easier to model simple stuff purely in the digital domain. Some operations were always easier to model with analogue components, sampling and displaying the results with digital computers.
@mikeall7012 a year ago
As part of my job, I work on a lot of control systems. Analog controls have far superior accuracy compared to digital, where the sampling-rate limitations can be a problem. But analog systems are finicky and drift, requiring a lot of maintenance, and digital systems are easier to make redundant. Still, digital can never hold a candle to the infinite sampling of analog.
@timobakenecker7314 6 months ago
This video really added new aspects to my knowledge of AI. Thanks for that!
@iantaylor7456 a year ago
Fun fact: the graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1200. He says it draws 100 W, but it actually draws about 250 W, so that tiny chip that only draws 3 W is even more impressive
@Matthew-rl3zf a year ago
Let's hope these new analog chips can solve our GPU shortage problem 😂
In general, when doing machine learning you are only using the CUDA cores of a graphics card, so the wattage never gets close to its maximum; a lot of the processing units - shaders and other 3D pipeline units, for example - are simply idle. On my GTX 1080 I sit between 60-90 W out of 200 W when doing PyTorch machine learning, so 100 W out of a maximum of 250 W seems reasonable.
@chrisoman87 a year ago
you can underclock GPUs - that's what they do in crypto mining to improve profit margins. Depending on the chip, they can operate efficiently at a fraction of their nominal power
@AC3handle a year ago
man, I'm old enough to remember when a $1200 card was considered EX PENS >IVE< And not...'going price'.
@chrisoman87 a year ago
@@AC3handle $1200 won't buy you enough power for a decent DL rig either. An RTX 3090 goes for ~$3000 USD
First comment ever on the channel, although I have been a big fan for a long time. This episode is very close to my heart: being a computer scientist myself, the topic touches me greatly. I always believed digital computing might soon be too slow to sustain scientific growth, as chips approach almost atomic scale and because of the large entropy that comes with binary conversions for large calculations. I believe that to go subatomic we will need a combination of digital and analog computing, and then the true power of quantum computing will be unleashed. This might have a very interesting effect on problems in NP ∩ co-NP such as factorization (e.g. RSA cryptography). Would love a video on this topic: what happens if quantum computers can decode cryptographic algorithms in polynomial time? What would be the consequence for digital security?
@4dverse a year ago
I think analog systems will fall short in AI-based probabilistic systems where the exact probability value matters.
@NoahSpurrier 9 months ago
I remember seeing an analog differential analyzer in high school, in my physics and electronics teacher's lab. It was more of a museum piece; it was never used. RIP Mr. Stark
@elliott8596 9 months ago
To be fair, many of the tools we use are analog. We just don't call them "analog computers"... even though they kind of are.
@rogerphelps9939 9 months ago
Exactly. Museums are where analog computers belong.
@gregseljestad2793 6 months ago
I just found out that the SR-71's engines had a hydraulic computer running the system. That would be amazing to see. I worked at Caterpillar, and a friend of mine was tasked with converting a scraper transmission module from a hydraulic base to electronic. It was a very old design and all of its engineers had passed on, so a team of engineers had to replicate all the hydraulic functions in an electrical equivalent. It was fascinating to me. One function they had to replicate was going up a steep hill with a full load and being able to shift without rolling backwards: holding the load, sharing it between two clutches, and engaging one clutch while releasing the other to make a seamless shift. So I enjoy this topic. Thanks!
@NovaPax 9 months ago
From my understanding, we're kind of doing the same thing in quantum computing: the "quantum time crystals" made by Google are somewhat like the analog chips, and we run digital checks on the data once it's done. I've also seen people layer GPT with Wolfram Alpha and other more specialized programs to improve accuracy.
@Septimius a year ago
I see Derek is getting into modular synthesizers! Also, funny to see how the swing in musical instruments from analog to digital and back is being mirrored in computing generally.
@paradox9551 a year ago
My first thought when he pulled out the analog computer was "Hey that looks like a modular synth!"
Witness Audio Modeling (search for it on YouTube).
@p1CM a year ago
Music has always been an AI task
As a semi-professional music producer with almost half a decade of working with professional musicians, I would agree - mainly because people feel a lack of "soul" in music: those small human errors we've spent decades trying to eliminate with Auto-Tune, drum machines, sequencers, digital synthesizers and digital samplers (the last two CAN create sounds that come out the same way every time as long as the input stays the same - though there are exceptions). This is probably what the people I know in the music industry refer to as "the generation rule": in brief, the music of today is a result of what our parents and grandparents heard, combined with new technologies and pop culture. If you're interested in music and want to stay ahead of the game, look it up. Some refer to it as the "30-year rule" as well.
@PetraKann a year ago
@@p1CM AI has no tasks
@lancegraham3344 10 months ago
Robotic motion seems like something that could benefit from the analog approach
@ralfsobe5529 8 months ago
I like that! This is a really much stronger chip for programmable neural networks than digital ones!
@woody5109 10 months ago
Cell phones used to be analog. Analog phones only required a 50% signal to maintain a connection, so you could always hear the other party, even in heavy static. This was so handy in poor atmospheric conditions that the military wanted it for themselves. We got stuck with digital, which drops your call if the signal drops below 98%….🤔
@shrimpbisque 1 year ago
People say combining classical and quantum computing into one machine would be great for calculations; a quantum processor for finding solutions and a classical processor for checking them. I think combining analog and digital into one machine could be similarly useful, since they're both suited to different types of computation.
@soundspark 1 year ago
Could analog computing be used for graphics processing, where a slight bit of noise/graininess might not be an issue, and may even help dither out banding that plagues too many games?
@belsizebiz 1 year ago
For amusement only: My first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculation of miss distances of a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing....
@_a_x_s_ 1 year ago
Thus the temperature coefficient is very important for modern precision devices. A high-accuracy, low-ppm resistor is expensive, which is one of the reasons high-end electronic instruments cost so much.
I was going to comment that one disadvantage of analog computers is keeping them calibrated. If you want a precise amount of 'voltage' or movement to represent a real-world value, you have to keep it calibrated. Older mechanical ones had wear/tear, electronic ones have issues as well.
Ah!? But what if the resistors were warming up, digitally?
@Cat-ir8cy 1 year ago
@@stefangriffin2688 you can't have a digital resistor
@@stefangriffin2688 yeah, digital signals work with gates: on or off
@manUNITED986 11 months ago
I work in AI, this has been a discussion for quite some time. Love to see some proof of concept here.
@xanthoptica 1 year ago
As others have pointed out, biological brains have both analog and digital components. The fact that there's a current threshold that opens ion channels gives them a "digital-like" output (though that output is usually a firing frequency rather than just on or off, which is kinda analog). The inputs into neurons, with stimulatory and inhibitory synapses, different numbers of synapses for an input from another neuron, different distances between synapses on a complex network of dendrites...buncha analog integration there. The AI approach of using analog for raw and complex calculation, but passing through a digital phase (filter?) to get some noise reduction may be a better analogy (sorry!) for a biological neural net than it first appears.
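The firing behaviour this comment describes, an all-or-nothing spike whose frequency nonetheless varies smoothly with input strength, can be sketched with a toy leaky integrate-and-fire model. All constants here are illustrative choices, not biological values:

```python
def firing_rate(input_current, threshold=1.0, leak=0.1, steps=1000, dt=0.01):
    """Count spikes from a toy leaky integrate-and-fire neuron over a fixed window."""
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += (input_current - leak * v) * dt   # analog integration of inputs
        if v >= threshold:                     # all-or-nothing spike: the "digital" part
            spikes += 1
            v = 0.0                            # reset after firing
    return spikes

# The spike itself is binary, but the *rate* varies smoothly with input strength,
# and a subthreshold input never fires at all.
print(firing_rate(2.0), firing_rate(1.2), firing_rate(0.05))
```

Stronger input reaches the threshold sooner, so the spike count over the window rises with input current, which is the "kinda analog" rate coding the comment mentions.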
@mallapurbharat 9 months ago
Amazing, thought-provoking two-part video on analog computing. Veritasium never disappoints!
@Tony770jr 9 months ago
I worked with machine learning applications 6 years ago on resource-constrained microcontrollers. After understanding how neural networks actually worked, I came to the realization that an analog equivalent would operate much faster. I mentioned this to my engineering manager at the time and he laughed at the idea. But I knew I was right!
@nicholassauer2612 10 months ago
As a kid, I preferred digital over analog. As an adult studying to be a certified electrician: I love analog. There's something beautiful about analog electronic components and stuff like vacuum tubes. Hell, audiophiles still use vacuum tubes for audio amplifiers and claim they produce a better sound.
@igvc1876 10 months ago
Perhaps that's because our brains are analog, and analog obviously prefers analog
@dust7962 1 year ago
The problem with this system of computing is that interference is a huge factor. When you only test whether there is voltage or not, you don't need to worry about interference. But when you build systems that use varying voltages, say 0 V, 0.5 V, 1 V, then you need to worry about interference, and the more levels you add, the bigger an issue this becomes. Interference can come in the form of microwaves, radiation, mechanical vibration (think fans cooling off a server rack), and the list drags on, as almost anything can cause interference. That oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but the cost is far higher than with a binary number system.
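The noise-margin point in this comment can be made concrete with a small simulation. The noise level and the number of signal levels below are arbitrary assumptions chosen for illustration:

```python
import random

def decode(v, levels):
    """Snap a noisy voltage to the nearest of `levels` equally spaced values in [0, 1]."""
    step = 1.0 / (levels - 1)
    return round(v / step) * step

def error_rate(levels, noise_sigma, trials=10_000):
    """Fraction of symbols corrupted by Gaussian noise added before decoding."""
    step = 1.0 / (levels - 1)
    errors = 0
    for _ in range(trials):
        sent = random.randrange(levels) * step
        if abs(decode(sent + random.gauss(0, noise_sigma), levels) - sent) > 1e-9:
            errors += 1
    return errors / trials

random.seed(0)
binary = error_rate(2, 0.1)       # one wide margin: noise must exceed 0.5 to flip a bit
eight_level = error_rate(8, 0.1)  # seven narrow margins: the same noise corrupts often
print(binary, eight_level)
```

The same noise that a two-level (binary) signal shrugs off corrupts a large fraction of eight-level symbols, which is why adding levels makes interference a progressively bigger burden.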
@introprospector 1 year ago
Binary computers have to deal with interference too, that's handled by error correction. Error correction is already baked into the infrastructure of every digital component, to the point where we don't realize it's there. They suggested one method of error correction in the video, and they're probably not even scratching the surface of what's possible.
@fltfathin 1 year ago
I think the crux is the medium: AI models the brain, which is so good at rebuilding itself, and the brain uses electrons and chemicals to convey information. Our transistors are too limited to mimic that interaction. For example, the new 3 W chip needs to be custom-made for each model, if I got that right.
@dust7962 1 year ago
@@introprospector Yes, but with binary, error correction is simpler, as interference isn't as much of a burden on the architecture. When the job is to check whether there is or isn't voltage, it is a lot less complex than checking 8 different voltage thresholds.
@dust7962 1 year ago
@@fltfathin This is called an ASIC (application-specific integrated circuit); the computer is pretty much just sent to the landfill after it's outlived its usefulness instead of being repurposed. Which is another concern about where computing in general is heading: as PCBs use less and less semi-precious or precious metal, there is less incentive to recycle.
@JayJay-dp8ky 1 year ago
@@dust7962 Yeah but I put my mobo in the case first and then the radiator wouldn't fit, so I had to take it out and install the radiator first. It was really annoying. I didn't watch this video, but I'm assuming this is what he was talking about.
@yassamena17 6 months ago
The explanation on how neurons fire or don’t fire is incredible.
@doctoroctos 1 year ago
Digital is either 0 or 1. Analog is anything in between. Programmable resistors in a MAC (multiply-accumulate) function is a cool concept. How would the system ensure reliability and repeatability given aging, temperature, process variation, and defect growth?
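For readers wondering how programmable resistors perform a MAC: by Ohm's law each cell contributes a current I = G·V, and Kirchhoff's current law sums those currents on a shared wire, so the total current is a dot product. A minimal idealized sketch, deliberately ignoring the drift and variation this comment asks about:

```python
# Each weight is stored as a conductance G (in siemens); each input arrives as a
# voltage V. Ohm's law gives a per-cell current I = G * V, and Kirchhoff's
# current law sums the currents on a shared output wire: a multiply-accumulate
# with no clocked arithmetic at all.

def analog_mac(voltages, conductances):
    """Idealized model: the total current on the wire equals the dot product."""
    return sum(v * g for v, g in zip(voltages, conductances))

weights = [0.5, -1.0, 2.0]   # negative weights need a second, subtracted line in real hardware
inputs = [1.0, 0.5, 0.25]
current = analog_mac(inputs, weights)
print(current)  # 0.5
```

A real array would add exactly the non-idealities the comment lists (temperature drift, aging, cell-to-cell variation) as perturbations to each G, which is why calibration and periodic digital correction come up throughout this thread.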
@CloudyDaze 1 year ago
This is the first discussion of artificial intelligence I've heard that didn't make me roll my eyes, and it actually had me not only entertained but intrigued. Well done. C:
@zynzy4u 10 months ago
Analog functions realized using DSP engines... There is a better way to train and create neural networks and digital-analog hybrids. When I was the leader of advanced R&D at an AI company, before I pissed management off and they me, the unique ideas and development of these techniques were progressing nicely: algorithms to sort data into identifications so efficiently they can almost be done by hand in real time. I use these methods for all my personal research, and with them have made stunning discoveries which have left me both incredibly enlightened and absolutely horrified. Such is true understanding.
@VarunGupta3009 9 months ago
I have always wondered why analog isn't used in more applications, given that it is instant and we hardly require the amount of precision that digital offers. I also drafted a document long ago that tried recreating many commonly used computer algorithms that would be faster (rather, instant) in analog with dedicated chips. Think about it: almost all of the digital content we consume is _compressed_ media in the form of audio+video. Most of this compression is lossy, from Instagram, to Spotify, to TRvid. Why not have dedicated analog processing chips for AV and AI, thereby removing 90% of the computational load from the CPU? In the future, we may well have one CPU and multiple APUs (analog processing units) on our devices :)
@lonewulf0328 1 year ago
This was one of the best layman's explanations of neural net training models that I have ever seen. Awesome content!
@duongchuc1834 1 year ago
@patakk8145 1 year ago
but it isn't; he literally said he's going to skip backpropagation (which is how models are trained nowadays)
@PaulAVelceaVSC 1 year ago
I am a layman; I did not understand a bit of it, pun intended
@fildorian6867 1 year ago
I am fully convinced that the "mystery of our brain" only exists because we don't yet know the true potential analog really has
@timidpeter 1 year ago
The advantages of digital are exact values for situations where precision is necessary, and the ability to compress data to its essential values, allowing delivery of more 'targeted' information. Even so, I prefer analogue computing as more suitable for solving contemporary problems that involve many parameters, even though many 'digital jobs' won't be available anymore.
@nachimanicharde4967 8 months ago
This is a wise analysis. Analog systems could be developed for passive functions like traffic lights, animal instructors, mall lighting systems, building automation systems, flood monitors, disaster monitors, etc.
@RefrigeratedTP 8 months ago
The company that makes the graphics card he's holding is eventually going to be his biggest competitor. Nvidia has already entered the AI chip space, but with digital chips, not analog ones. Great video. That very smart guy has a huge amount of pressure ahead of him.
@kabatsky 9 months ago
Thank you for such a great video. As a sidenote: I still don't think we need machines that think like us, we're clearly not the brightest creatures in this universe, that's all I'm saying 😅
@jeffc5974 1 year ago
One of the first things I learned in Electrical Engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there that remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.
@jasonbarron6164 1 year ago
At the expense of accuracy?
@JKPhotoNZ 1 year ago
Well, semi-analogue. Don't forget the bias (voltage drop) before you get current amplification. Also, the claim that analogue computers are more power-efficient than digital is pretty hard to back up. A $2 microcontroller can run on a few mA for the desired task, then sleep on µA. You'll need at least 5 mA for an analogue computer to start with, and you can't make it sleep.
@danimayb 1 year ago
@@JKPhotoNZ Great point. And with current nanoscale transistor technology, that efficiency (along with raw power) is going far beyond what a true analogue system could produce.
@rahulseth7485 1 year ago
Yeah, but then you'll never know which zone it's operating in, because amplification happens differently for different input parameters. And not all transistors from the same batch will perform the same, i.e. it will lack repeatability (as Derek mentioned).
@mycosys 1 year ago
The insoluble (even in theory) problems of analog are noise and signal integrity, which is why he didn't even mention them. This channel has gone to poop, honestly.
@chrislong3938 1 year ago
When I was in the Army in the '70s, I was in a counter-battery radar unit (MOS 17B). Our radar was an AN/MPQ-4, and it was governed by an analog computer to direct artillery fire. It was very good at its job, but if you looked at it wrong, it would go down. Almost all of its downtime, though, was caused by problems with the radar transceiver rather than the computing part, which was completely gear-driven and very robust, like an old NCR cash register (amazing machines if you've never seen the innards of one!). Incidentally, one of my Chief Warrant Officers (CW4) was involved in its design, and boy was he smart! He'd been in the Army since the Korean War and was an old fart when I knew him.
@yewenyi 6 months ago
Many, many years ago I learned to program using analog computers. Then, while I was still in engineering, digital computers became a thing, and by my fourth year we were using only digital computers.
@madmotorcyclist 5 months ago
I remember Lisp machines from the '80s and early '90s. How they handled memory was so different from what we use today, within the confines of the Lisp language (Java basically took its garbage-collection technology from Lisp). I miss the flexibility of that language, which was more memory-efficient than standard languages (a 4 MB Lisp core took 10 MB to replicate in C++) and more efficient to code in as well, though not as fast as standard compile-link languages, being an incremental-compiler-based language. Lisp was the progenitor language for AI.
@Freelix2000 1 year ago
This video is awesome, and I learned a lot from watching it, but one thing really confused me at the end. Towards the end, there was an implication that an analog neural network would have to be physically present in any device in which it is used. But neural networks aren't computationally expensive to use (to compute an output given an input), they're only computationally expensive to train. Is it possible to train a neural network on analog and then export the network weights onto a digital system? It seems like that would be the preference. It doesn't even have to be an easy process. Google can use analog microchips to train parts of their search algorithm and retrain as often as they're able, and as long as they can export the weights back to digital, no analog needs to be involved when I Google "what the heck is neural network". I can see how one part of the problem could be extracting the values stored in each weight (measuring the conductivity of particular repurposed storage gates), but in that case, you could focus on making the chip more modular for any manual steps needed to read the values instead of focusing on making it more compact and perfectly efficient. Even if you have to train the model, test it, and then tear it apart to get the weights, it would still probably be worth it for the portability of the end result.
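The export idea in this comment can be sketched as a round trip: measure each analog cell's value with some instrument noise, then snap the measurements onto a fixed-point grid for a digital deployment. Every number below (noise level, bit width, the weights themselves) is a made-up illustration, not anything from the video:

```python
import random

def read_out(true_weights, noise_sigma=0.01):
    """Model measuring each analog cell's conductance with some instrument noise."""
    return [w + random.gauss(0, noise_sigma) for w in true_weights]

def quantize(weights, bits=8, w_max=1.0):
    """Snap measured weights onto a fixed-point grid for a digital deployment."""
    scale = (2 ** (bits - 1) - 1) / w_max
    return [round(w * scale) / scale for w in weights]

random.seed(1)
trained = [0.25, -0.75, 0.5, 0.125]      # weights as they sit in the analog array
exported = quantize(read_out(trained))   # what a digital copy would run with
max_err = max(abs(a - b) for a, b in zip(trained, exported))
print(exported, max_err)
```

As long as the combined measurement noise and quantization error stay below the network's tolerance, the exported digital copy behaves like the analog original, which supports the commenter's point that the analog hardware need only be present for training, not deployment.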
@guppy277 1 year ago
"Now the size of a transistor is approaching the size of an atom, so there are some fundamental physical challenges to further miniaturisation." What an articulation! 👌👌 I am no scientist, nor an electronics engineer, yet I entirely understand and relate to what this is! It feels great to be living in these times of TRvid with the likes of Derek, Tim Dodd, Tim Ellis and the rest.
As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.
@magma5267 1 year ago
You must be really healthy because you don't even look close to 70! :D
@@magma5267 Thanks for the heads up. I've got a good woman, 6 children, 8 grandchildren and a recently placed stent in my heart that keeps me going :)
@reddyanna2609 1 year ago
@@TerryMurrayTalks Good for you my man! I am still 20 and don't know what to do in life :(
@vource2670 1 year ago
Yep, you're 70 mate
@@reddyanna2609 You have plenty of road left to travel, follow what you love, enjoy the journey, don't let the bumps in the road stop you and if you can get a soul mate to share it with you it will all be good.
@mos2008 5 months ago
I don't know why, but analog computers (like the Comdyna GP-6) look really futuristic to me. I'm personally fearful of AI, so I would use it for calculations that do not require precision.
@Psrj-ad 9 months ago
This makes me want Derek to talk about neural networks and AI-related topics a lot more. It's not just extremely interesting but also constantly developing.
@yourright4510 6 months ago
While it may be true that we are reaching a limit, we're not quite certain what computational power new neural networks will need for the applications of the future. That hints at why analog computation is coming to the forefront.
@derpnerpwerp 2 months ago
I've been working in the field of machine learning for several years now, and I've always said analog computers would be much more efficient... if you could account for the loss of precision and error issues
@funktorial 1 year ago
started watching this channel when I started high school and now that I'm about to get a phd in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. not dumbed down, just clear and accessible. great stuff! (and this totally nerd-sniped me because i've been browsing a few papers on theory of analog computation)
@gautambidari 1 year ago
Absolutely. Love the way he covers the concept for everyone. Those who don't know in depth about it can still go away with a sort of basic understanding. And those who do understand it in depth will enjoy discovering new areas of invention that they can further explore. Looking forward to reading some papers on using analog computing in neural network applications
I'm smart too.
@mentaltfladdrig 1 year ago
Same here. But I didn't go to high school, my life became a total mess, and I haven't graduated whatsoever :)
@SteveAcomb 1 year ago
“nerd-sniped” lmao I feel exactly the same and here I was thinking I was way ahead of the curve on alternate computing 😂 jokes on me
@occasionalshredder 6 months ago
What if memory chips were spread out across the card: large numbers of small chips, with the ability to access and use whichever is closest, and a web for them to communicate with each other and swap information up a chain? It could be an interesting way to work around certain current limitations.
@kasparsiricenko2240 8 months ago
When I was at my institute back in 2016, I was already thinking about these specific "gates" as an undergraduate. I knew someone was already implementing it, but I still miss the time when I could have been part of the innovation. What a genius way of reimplementing circuits for neural networks. Maybe that's the future of FPGAs: neural networks.
@SaunaShruti 5 months ago
Now Ilya Sutskever has made a huge comeback as the mind behind ChatGPT-4. He was also among the members who published papers and worked on AlexNet. He is pure hard work.
These seem more like analog processes built inside digital hardware, though the model of using the bias voltages of the memory cells could be applied to PGA chips. For an example of analog processes running in analog hardware, something like the task-specific "Lean Burn Computer" built by Chrysler in the '70s seems closer to some of the descriptions: it processed engine speed, load, heat, and throttle movement over time to determine an ignition advance setting that would use less fuel, though it did require the driver to lift their foot off the pedal slightly, as the advancing ignition curve would cause the car to surge a little, something cruise control could adjust for.
@YolandaEzeagwu 9 months ago
I had a business analysis course that tried to explain the perceptron, and I didn't understand anything; I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel
@FragEightyfive 1 year ago
Using an analog computer to demonstrate differential equations is a perfect teaching tool. I really wish tools like this were used in colleges more often.
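To see why an analog computer is such a natural teaching tool here: it solves, for example, x'' = -x by wiring two integrators back into each other. A discrete simulation of that integrator loop, standing in for the continuous machine (the time step dt is an arbitrary choice):

```python
import math

# An analog computer solves x'' = -x by feeding two integrators into a loop.
# This semi-implicit Euler simulation approximates that continuous loop.
dt = 0.001
x, v = 1.0, 0.0                       # initial conditions: x(0) = 1, x'(0) = 0
for _ in range(int(math.pi / dt)):    # run the "machine" until t ≈ pi
    v += -x * dt                      # first integrator:  v' = -x
    x += v * dt                       # second integrator: x' = v
print(x)  # close to cos(pi) = -1
```

On the physical machine the two integrators run continuously and in parallel, so the solution appears in real time on an oscilloscope; the loop above is only the digital caricature of that behaviour.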
@Elrog3 1 year ago
They already use crap like this far too often. This isn't something to use in a differential equations course. Maybe it would be OK for a circuits course, or even a computer science course focused solely on analog computers. In math, just give us the numbers and the logic... Don't waste our time with this stuff.
@Elrog3 1 year ago
@@JackFalltrades I am an engineering student.
@Elrog3 1 year ago
@@JackFalltrades I'm not calling letting students know of use-cases for things crap. I'm calling taking up class time that is meant for teaching students the logic of how to solve differential equations (because that is the class the original poster said it would be good for) and instead using that class time to teach something that only a tiny fraction of the class would ever use.
@quotidian8720 1 year ago
It is used in control systems
@Noootch 1 year ago
@@Elrog3 He never said it should be used in a differential equations course. You just sound like the type of students that go to university and ask which courses they need in order to get a high salary position in industry.
@0raj0 19 days ago
It would be a nice touch if - when demonstrating an analog computer - you would connect its output to an actual, classic analog oscilloscope, instead of a digital one, that in fact simulates operation of an actual oscilloscope :)
@Krischi6 11 months ago
Appreciate the animations! I am not that good with dry theory, so this helps me a lot in understanding things
@lucabaar1 1 year ago
Black and white (binary) vs. indefinite relative relations (analogue). The question is how to discern and correlate in a way that maintains computational integrity across a large system. The most fascinating thing about analogue computing is its fluid ambiguity compared to traditional binary computing: strings of 1s and 0s versus a breadth of expressive potential through the relative weighting of the relevant inputs. That makes it a far richer way to express variables when representing complex phenomena than binary computation.
Great video. Seems the future may be a hybrid solution involving both analog and digital processing working in harmony.
@suivzmoi 1 year ago
As a NAND flash engineer, that bit about the usage of floating-gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb", where even low voltages applied to the gate (like during a flash read) to query its state can eventually cause the state itself to change. You would think it's a binary effect, where if the voltage is low enough it would just be a harmless read, but no... eventually there will be electron build-up in the gate (reading it many times at low voltage has a similar effect to programming it once at a high voltage). In this particular application, the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains, where the more we recall a particular memory, the more inaccurate it can become.
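The drift this comment describes can be caricatured in a few lines; the per-read disturbance value here is entirely made up, and real read disturb is far smaller per access and statistical in nature:

```python
DISTURB_PER_READ = 1e-4   # assumed charge injected into the cell by each read

def read(stored):
    """Return the value seen by this read and the slightly disturbed stored value."""
    return stored, stored + DISTURB_PER_READ

weight = 0.5
for _ in range(1000):
    seen, weight = read(weight)

print(weight)  # the stored weight has crept upward, away from 0.5
```

Each individual read looks harmless, but a thousand queries later the stored weight has drifted noticeably, much like a memory that changes a little every time it is recalled.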
@donkisiko 1 year ago
Underrated comment!
@Xavar1us 1 year ago
Absolutely love this comment! This has been on my mind for at least an hour now, the point you make is intriguing and a bit haunting, thanks for that!
@JeyeNooks 1 year ago
Fkin right on!!
@Lassana_sari 1 year ago
Very interesting.
@sampathsris 1 year ago
Underrated comment. Then in Eternals style we will have to reprogram the memories of our servants every now and then.
@MynameisBrianZX 1 year ago
Would it be more accurate to say that analog computation is finding an important new application (neural networks)? Because my naive thought is that sensors and analog signal processing circuits count as very common analog computers. Or is there an important factor that defines “computation”?
@Neuro537 11 months ago
Analog computing is quite possible: just vary the voltage or the amperage, and with a sensor that can read both you have 2 values travelling on the same wire
It just hit me: true AI will come when an analog computer is responsible for determining each 0 or 1 in a digital processing computer.
@willembielefeld712 6 months ago
The question still remains: are the neurons in our brain really the best template? The idea of analog computing is also one part of quantum CPUs: instead of 0 or 1, you want to achieve in-between states for each piece of information. But in the end, it will be a matter of software architecture, and I think this might be a very difficult one. Maybe technically it works better, but creating software for it can be quite difficult; it is like reinventing the computer from scratch. If you have a CPU which has analog states, these states must be interpreted correctly by the software and the programming language. But, well, I guess it could be done. Maybe with the help of the computers we have now, it might be possible to create a programming language for an analog CPU. That is far beyond my own IT skills :D
@Nathan-vt1jz 4 months ago
I’m most excited to see the combination of analog and digital in one system.