During my teenage and college years in the 2000s, I was inspired by what I read about Bell Labs, and I wanted to work as a computer science researcher in industry. I was also inspired by Xerox PARC’s researchers of the 1970s and 1980s. I pursued that goal and worked at a few industrial research labs before I switched careers to full-time community college teaching a few months ago.
One thing I lament is the decline of long-term, unfettered research across the industry. I’ve witnessed more companies switching to research management models where management exerts more control over the research directions of their employees, where research directions can abruptly change due to management decisions, and where there is an increased focus on profitability. I feel this short-term approach will cost society in the long term, since current funding models promote evolutionary work rather than riskier, potentially revolutionary work.
As someone who wanted to become a researcher out of curiosity and exploration, I feel alienated in this world where industry researchers are harangued about “delivering value,” and where academic researchers are pressured to raise grant money and to publish. I quit and switched to a full-time teaching career at a community college. I enjoy teaching, and while I miss the day-to-day lifestyle of research, I still plan to do research during my summer and winter breaks out of curiosity and not for career advancement.
It would be great if there were more opportunities for researchers to pursue their interests. Sadly, though, barring a cultural change, the only avenues I see for curiosity-driven researchers are becoming independently wealthy, living like a monk, or finding a job with ample free time. I’m fortunate to be in the last situation: I have 16 weeks per year outside my job that I can devote to research.
Back in 1963, I had the privilege of participating in a program the summer before my senior year of high school. This was at the research lab of GE in Schenectady. I was working (not very productively) on a project involving platinum catalysis of hydrocarbons – what eventually became catalytic converters. The kid who tutored me in calculus (and later won a MacArthur grant and founded his own think tank) worked on what was then called nuclear magnetic resonance, or NMR. Now it’s the guts of MRI machines (they left off the “nuclear” for PR reasons). I wonder how many kids these days have the chance to do shit like that, and if there are any labs where long-term research like that is funded.
> the only avenues I see for curiosity-driven researchers are becoming independently wealthy, living like a monk
I came to the same conclusion. This is the path I'm following (trying to set up a company and lean FIRE). It's sad in a way because those efforts and years could have been directed to research but we have to adapt.
That was mostly how the big scientific breakthroughs came in the 1800s and 1900s: independent wealth.
That’s what a “scholar” is, and universities provided the perfect environment for that to thrive, which is no longer the case.
In the 1800s and early 1900s, maybe…
In post-WW2 America, though, there was increased funding from the state, so large research universities, institutes, and national labs could be created. In the era when all that was really working at full speed, “big scientific breakthroughs” came at such a pace that it became hard to see what was big or not.
Edison's research lab was funded by companies wanting specific inventions developed.
I think that this was Paul Graham's original ambition for YC, really: a hope that some at least of the successful founders would choose to take their winnings and implement the next Lisp Machine and similar projects. Unfortunately, as with other things, winning the SV VC game just seems to incline people to either keep climbing that same greasy pole, or to do unstrenuous rich-guy things, or some combination of those two.
I've seen that many times over by now, and have sort of done it myself. It doesn't really work. You end up trading one problem for another. There is also a heavy dose of procrastination and escapism involved. Think about how many people could, and do, do it, yet how few results there are.
All it took was one bored patent clerk spending idle time thinking about something that he couldn't just let go and now we have General Relativity and black holes.
I was going to make a snarky remark like “Researchers are just going to have to write on Substack”.
Then I read this: http://mmcthrow-musings.blogspot.com/2020/10/interesting-opi...
I think the alt-economy that you describe may turn up soon. At least the one that I’m imagining, which doesn’t involve registering for Substack.
> Our economy promotes short-term gains and not long-term initiatives; I blame over 30 years of artificially-low interest rates for this.
Don't low interest rates promote long-term thinking, perhaps to an absurd degree (e.g. the "it's okay that we hemorrhage money price dumping for 10 years as long as we develop a monopoly" playbook)? Bigger interest rate = bigger discount for present value of a future reward.
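For concreteness, a minimal sketch of that discounting arithmetic (the payoff, horizon, and rates below are made-up numbers, purely for illustration):

```python
# Present value of a future reward: PV = FV / (1 + r)^n
# Illustrative numbers only: a $1M payoff 20 years out.

def present_value(future_value: float, rate: float, years: int) -> float:
    return future_value / (1 + rate) ** years

payoff, horizon = 1_000_000, 20
for rate in (0.01, 0.05, 0.10):
    pv = present_value(payoff, rate, horizon)
    print(f"rate {rate:>4.0%}: ${pv:>10,.0f} today")

# rate   1%: $  819,544  -> the long-term bet is worth almost face value
# rate   5%: $  376,889
# rate  10%: $  148,644  -> same payoff worth ~6x less; short term wins
```

On that arithmetic alone, low rates should favor patient bets; whether that patience gets spent on research or on a decade of price dumping is a different question.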
I'm guessing that they're referring to the practice of investors borrowing at low interest rates to buy large amounts of stock in a company, then milking the company of all its value for short-term gains. Unless there's a significant plan in place, many companies can't really get away with long-term playbooks if they are responsible to shareholders using ownership for short-term gains (so they can quickly move to the next money-making asset).
The truth is that kind of research can only happen with a very rich monopoly.
Bell labs came about when AT&T was the monopoly telephone provider in the US.
PARC happened when Xerox had a very lucrative monopoly on copy machines.
I have come to realize that over the years, though I still believe that wealthier companies like Apple, NVIDIA, Facebook, and the like could fund curiosity-driven research, even if it’s not at the scale of Bell Labs or Xerox PARC.
On a smaller scale, there is the Institute for Advanced Study, where curiosity-driven research is encouraged, and there is the MacArthur Fellowship, where fellows are granted $150,000 annual stipends for five years to pursue their visions with no strings attached. Other than these, though, I’m unaware of any other institutions or grants that truly promote curiosity-driven research.
I’ve resigned myself to the situation and have thus switched careers to teaching, where at least I have 4 months of the year “off the clock” instead of the standard 3-4 weeks of PTO most companies give in America.
If Zuck's obsession with VR isn't curiosity-driven research, then nothing is.
$10 billion in yearly losses for something that by all accounts isn't close to magically becoming profitable. It honestly just seems like something he thinks is cool and therefore dumps money into.
Maybe it's rare to do curiosity driven research.
But since the days of Bell Labs, haven't we greatly improved our ability to connect a research concept to the idea of doing something useful, somewhere?
And once you have that, you can be connected to grants or some pre-VC funding, which might suffice, given that the tools we have for conceptual development of preliminary ideas (simulation, for example) are far better than what they had at Bell?
I thought I had read somewhere that 2 weeks of vacation is more common in the USA, at least at software companies, before things like "unlimited vacation". Which is right, 3-4 weeks or 2?
What makes you think they don't fund it?
This is not the mindset of monopolies; cutting research for the sake of short-term profits is the mindset of Wall Street's modern monopoly.
I agree with you that the modern corporate world seems to be allergic to anything that doesn't promise immediate profits. It takes more than a monopoly to have something like Bell Labs. To be more precise, monopolies tend to have the resources to create Bell Labs-style research labs, but it also takes another driving factor to create such a lab, whether that is pleasing government regulators (I believe this is what motivated the founding of Bell Labs), staying ahead of existential threats (a major theme of 1970s-era Xerox PARC was the “paperless office,” which Xerox saw as an existential threat to its photocopier monopoly), or purely giving back to society.
In short, Bell Labs-style institutions not only require consistent sources of funding that only monopolies can commit to, but they also require stakeholders to believe that funding such institutions is beneficial. We don't have those stakeholders today, though.
That's my conclusion as well, since the closest thing we have to Bell Labs now is Google R&D, where a virtual monopoly on Internet search lets Google hire excellent, well-paid researchers [1].
[1] US weighs Google break-up in landmark antitrust case:
Bell Labs was also funded by a massive monopoly.
>The truth is that kind of research can only happen with a very rich monopoly.
A classification which includes government funding, note.
Okay, I'm really in a sad mood, so: tell me there will be places like that, again, somewhere, ever?
We need this. Like, really, we need someone to have created the Xerox PARC of the 21st century, somewhere, about 20 years ago.
I honestly thought Google would be that - but apparently it's easier to fund R&D on "selling copying machines" than on "selling ads". Maybe "selling ads" earns _too much_ money? I don't know.
I know, I know, DeepMind and OpenAI and xAI are supposed to fix climate change any time soon, and cure cancer while they invent cold fusion, etc., etc. And it's only because I'm a pessimistic myopist that I can only see them writing fake essays and generating spam. Bad me.
Still. Assuming I'm really grumpy and want to talk about people doing research that affects the physical world in a positive way - who's doing that on the scale of PARC or Bell?
The secret hero of that time was the US government. I’m not talking about the MIC, which is still quite robust and more bad than good. I am speaking more broadly. If you had a practical PhD and were willing to show up at a place at 9:00, you could get a solid upper middle class job with the Feds where you couldn’t get fired unless you broke the law.
The government also has always kept academia afloat. It is a privilege afforded to professors to believe they do not work for the state, but they do.
Great government and academic jobs forced companies to create these labs where it was better to hire great people and “lose” some hours to them doing whatever they want (which was still often profitable enough) than have zero great people. Can you imagine Claude Shannon putting up with the stuff software engineers deal with today?
The other main change is that how to run big companies has been figured out well enough that “zero great people” is no longer a real survival issue for companies. In the 1970s you needed a research level of talent but most companies today don’t.
Something that just dawned on me is the downstream effects of the United States’ policy regarding science during WWII and the Cold War. The Manhattan Project, NASA, the NSA and all of its contributions to mathematics and cryptography, ARPA, DARPA, and many other agencies and programs not only directly contributed to science, but they also helped form a scientific culture that affected not only government-run and government-funded labs but also private-sector labs, as people and ideas were exchanged throughout the years. It is a well-documented fact that Xerox PARC’s 1970s culture was heavily influenced by ARPA’s 1960s culture.
One of the things that has changed since the 1990s is the end of the Cold War. The federal government still has national laboratories, DARPA, NASA, the NSF, etc. However, the general culture has changed. It’s not that technology isn’t revered; far from it. It’s just that “stopping Hitler,” “beating the Soviets,” and grand visions for society have been replaced with visions of creating lucrative businesses. I don’t hear about the Oppenheimers and von Neumanns of today’s world, but I hear plenty about Elon Musk and Sam Altman. That’s not to disrespect what they have done (especially with the adoption of EVs and generative AI, respectively), but the latter names are successful businessmen, while the former names are successful scientists.
I don’t know what government labs are like, but I know that academia these days has high publication and fundraising pressures that inhibit curiosity-driven research, and I also know that industry these days is beholden to short-term results and pleasing shareholders, sometimes at the expense of the long term and of society at large.
> I don’t hear about the Oppenheimers and von Neumanns of today’s world
Sadder still is the underlying situation behind this: the fact that there's nothing of even remotely comparable significance happening in the public sphere for such minds to devote themselves to, as those men did. Even though the current civilizational risk is, if anything, significantly greater than in their time.
> It’s not that technology isn’t revered; far from it. It’s just that “stopping Hitler,” “beating the Soviets,” and grand visions for society have been replaced with visions of creating lucrative businesses.
Any kind of societal grand vision we had has been falling apart since about 1991. Slowly at first (all the talk about what to do with the "peace dividend" we were going to get after the fall of the Soviet Union), then faster with the advent of the internet, and faster still when social media came on the scene. We no longer have any kind of cohesive vision for what the future should look like, and I don't see one emerging any time soon. We can't even agree on what's true anymore.
> I don’t know what government labs are like
Many of these are going to be in danger in the next administration especially if the DOGE guys get their way.
> successful businessmen, while the former names are successful scientists
We’ve seen this before with Thomas Edison.
>It’s not that technology isn’t revered; far from it. It’s just that “stopping Hitler,” “beating the Soviets,” and grand visions for society have been replaced with visions of creating lucrative businesses
Universities are tripping over themselves to create commercialization departments and every other faculty member in departments that can make money (like CS) has a private company on the side. Weird that when these things hit, though, the money never comes back to the schools
Yup. Silicon Valley would not exist without large government spending.
You can bet this spending is going to be among the first things slashed by DOGE-like efforts ("Scientists? They're just liberal elites wasting our hard-earned money researching vaccines that will change your dog's gender in order to feed it to communist immigrants.")
I suppose I could be cheered up by the irony, but, not today.
> I honestly thought Google would be that - but apparently it's easier to fund R&D on "selling copying machines" than on "selling ads". Maybe "selling ads" earns _too much_ money? I don't know.
I'm pretty sure Google Brain was exactly what you are looking for: people like to think of DeepMind, but honestly, Brain pretty much had Bell Labs'/PARC's strategy: they hired a bunch of brilliant people and told them to just "research whatever you think is cool". And think of all the AI innovations that came out of Brain and were given to the world for free: Transformers, Vision Transformers, Diffusion Models, BERT (I'd consider that the first public LLM), Adam, and a gazillion other cool things I can't think of right now... Essentially, all of the current AI/LLM craze started at Brain.
Yes, it was basic research (guided toward the field of machine learning), but between a search monopoly and their autonomous car project, they definitely have a great economic engine to use that basic research and the talent it pulled into Google, even if a lot of it escaped.
Right. And I'm sure that if I ever get in a better mood, I'll find that the current AI/LLM craze is good for _something_.
Right now the world needs GWh batteries made of salt, cheap fusion from trash, telepathy, a cure for cancer, and a vaccine for the common cold - but in the meantime, advertisers can generate photos for their ads, which is _good_, I guess?
Your problem stems from assuming our natural state is some Star Trek utopia, and only our distraction by paraphernalia is preventing us from reaching such a place. Like we are temporarily (temporally?) embarrassed ascended beings.
Humanity’s natural state is abject poverty and strife. Look at any wealth graph of human history and note how people are destitute right up until the Industrial Revolution, and then the graph explodes upward.
In a way we (well, especially the West) are already living in utopia. You’re completely right that we can still vastly improve, but look back at the progress we already made!
It does sound like you're in a particularly bad mood, so yes, maybe that colors your outlook. Maybe it helps to think of a darker timeline where Google would have kept all of these advances to itself and improved its ad revenue. Instead it shared the research freely with the world. And call me naive, but I use LLMs almost daily, so there definitely _is_ something of value that came out of all this progress. But YMMV, of course.
Can't you get telepathy from training AI on functional MRI data? And then finding a way to pinpoint and activate brain regions remotely?
I mean brain-machine interfaces have been improving for quite a while.
Telepathy might even already exist.
Rolling back the 1980s neoliberal cultural ideals of letting markets and profits be the highest arbiter of societal direction is the key.
Silicon Valley hippies have been replaced by folks focussed on monetisation and growth.
It’s not great for the west, but those problems are being tackled. We just don’t get to read about it because ‘China bad’ and the fear of what capital flight might do to arguably inflated US stock prices
https://www.energy-storage.news/byd-launches-sodium-ion-grid...
Extreme ultraviolet lithography originated with a paper out of Bell Labs in 1991; then the US government funded multiple research efforts via the national nuclear research labs, which came up with a potential implementation method, but it took 20 more years of trial and error by ASML to make a practical machine they could sell. Other companies tried and gave up because of the technical challenges. This advance is responsible for modern chip fabs' fastest chips.
No, but you also shouldn't romanticize Bell Labs _too_ much. It was not exactly a fun place to work. You got a 2 year postdoc and then you were out, and those two years were absolutely brutal. Its existence was effectively an accounting fluke. Nothing like it really exists now because it would largely be seen as an inefficiency. Blame Jack Welch, McKinsey, KKR, HBS or whoever you like.
I was a postdoc there and I would not say it was brutal. I got a very good salary (far above an academic postdoc), health benefits, relocation, the ok to spend $1000/day on equipment with no managerial review[0], and access to anyone and everyone to whom I felt like speaking. I read horror stories in Science and other journals about people's experiences elsewhere and am grateful that I was spared so much nonsense. It was the greatest university I have ever set foot in. I still feel unworthy of the place.
[0]This was in the early '90s when $1000 went a long way.
> It was not exactly a fun place to work.
I couldn't disagree more, but perhaps the time I was there (late 90s) was different.
Just curious, are you speaking from personal experience?
Everyone wants Bell Labs, but not the thing that made it possible — high corporate profit taxes. They were making bucketloads of monopoly money and had to put it somewhere or taxes would take it away.
I think a lot of people want high corporate profit taxes
People own corporations. If you tax the income of the people who own corporations, it's not strictly necessary to tax the corporations themselves; just pass their profits through to the people and tax the people. There are good reasons not to double-tax, first at the corporate level and then again at the personal level.
This is similar to the perverse incentive of share buybacks. It's just more tax efficient to pump stock value than pay proper dividends.
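A rough sketch of that tax asymmetry, with invented flat rates (assumptions for illustration only; real treatment varies by jurisdiction, bracket, and holding period):

```python
# Hypothetical: a company returns $100/share to holders, either as a
# dividend or by buying back shares (lifting the share price instead).
# The flat rates below are assumptions, not real tax law.
DIVIDEND_TAX = 0.30       # assumed rate on dividend income
CAPITAL_GAINS_TAX = 0.20  # assumed rate on realized gains

payout = 100.0

# Dividend: taxed in full in the year it is paid.
dividend_net = payout * (1 - DIVIDEND_TAX)                  # $70.00

# Buyback: tax is due only when the holder sells, and only on the
# gain; the sale (and thus the tax) can be deferred indefinitely.
buyback_net_if_sold_now = payout * (1 - CAPITAL_GAINS_TAX)  # $80.00
buyback_net_if_held = payout                                # $100.00 unrealized

print(dividend_net, buyback_net_if_sold_now, buyback_net_if_held)
```

Under those assumed numbers the buyback route never does worse for the holder, which is the perverse incentive described above.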
Corporations are apparently people too! They should be taxed like any other flesh and blood human!
I really want governments around the world to take back governance so that it is solely for the benefit of people. None of this greedy, corrupt lobbyist stuff.
How else will we know if seatbelts are effective in Africa?
Congrats, you've been brainwashed by billionaires into voting against your interest.
Yes: disincentivize dividends and buybacks; reincentivize investment, R&D, and the like.
That makes no sense. Dividends and buybacks are both taxed.
From what I understand, they weren't really allowed to sell anything not related to telephony, either.
So they could create Unix, but they weren't allowed to profit off of it. So they just gave it away, because why not.
Just reading the book The Idea Factory; the amount of innovation was incredible. Lasers, early satellites, transistors.
And it was all done, apparently, at least in the beginning, because they hired smart people and they let them do what they wanted.
Almost all of the things I can think of that came from Bell Labs are things that helped their business. The only ones where I don't know how they helped the business are Hemo the Magnificent and similar films, but I'm sure those helped with PR.
Monopoly may have helped them pay for such R&D, but vertical integration is what made it possible for so much R&D to be relevant to the business.
I think it ended up helping their business because they deliberately made it so later, but it's not clear how some inventions would have helped their business in advance. From the book, there were quite a few things that sat on the shelf for years until someone figured out a way to use them later.
All true, but monopoly profits sure help.
As do high corporate tax rates
I know the author, Jon. Delightful guy
Unix, C, and C++ too.
And S, the statistical data language that was the ancestor of S-PLUS and R.
did not know about that, thank you.
Haven't gotten to that part of the book.
They don't cover it in the book, unfortunately. A serious omission, in my eyes.
Pasting my comment on the article https://www.construction-physics.com/p/what-would-it-take-to... :
> RCA Laboratories/the Sarnoff Research Center is surely one of the most important of the American corporate labs with similarities to Bell Labs. (It features prominently in Bob Johnstone's We Were Burning https://www.hachettebookgroup.com/titles/bob-johnstone/we-we... : it has a big role in the history of the Japanese semiconductor industry, in large part because of its roles in the development of the transistor and the LCD and its thirst for patent-licensing money.)
>> In Dealers of Lightning, Michael Hiltzik argues that by the 1990s PARC was no longer engaged in such unrestricted research decoupled from product development.
> According to Hiltzik and most other sources, the PARC Computer Science Lab's salad days were over as early as 1983, when Bob Taylor was forced to leave, while the work of the other PARC labs focussed on physics and materials science wasn't as notable in the period up to then.
Seriously: if this kind of thing interests you at all, go and read We Were Burning.
> and the other co-inventor of the integrated circuit was Fairchild Semiconductor, which as far as I can tell didn’t operate anything like a basic research lab.
Kind of a strange statement. Fairchild took the "traitorous eight" from Shockley Semiconductor, which was founded by William Shockley, who famously co-invented the transistor at Bell Labs (and who named the "traitorous eight" as such.)
So while Fairchild "didn’t operate anything like a basic research lab", its co-invention of the IC was not unrelated to having a large amount of DNA from Bell Labs.
Bell Labs is wonderful to read about, and I've really loved delving into it. Alan Kay's talks in particular.
However, it should be seen as a starting point! Alternative hypothetical pasts and futures abound. One issue is that the stuff from the past always looks more legendary seen through the lens of nostalgia; it's much harder to look at the stuff around you and to go through the effort of really imagining the thing existing.
So that's my hypothesis - there isn't a smaller volume of interesting stuff going on, but viewing it with hope and curiosity might be a tad harder now, when everyone is so "worldly" (i.e., jaded and pessimistic).
Proof:
https://worrydream.com/ (Bret Victor)
and the other people doing Dynamicland and Realtalk, both discussed straightforwardly here:
https://dynamicland.org/2024/FAQ/
https://solidproject.org/about -- solid, tim berners-lee and co, also.
https://malleable.systems/catalog/ -- a great many of the projects here are in the same spirit, to me, as well!
https://spritely.institute/ -- spritely, too
https://duskos.org/ -- duskOS, from Virgil Dupras
https://100r.co/site/uxn.html -- 100 rabbits, uxn, vibrating with new ideas and aesthetics
https://qutech.nl/ -- quantum research institute in the netherlands, they recently established a network link for the first time I believe
etc etc. These are off the top of my head, and I'm fairly new to the whole space!
The research done at Bell Labs is the foundation of the information age; however, Bell Labs also sowed the seeds that made the post-divestiture AT&T a doomed enterprise from the start - there is a reason they only lasted ~20 years from divestiture 'til they were bought by one of their former children, SBC.
AT&T provided, for most of its history, the best-quality telephone service in the world, at a price comparable to anyone else's, anywhere.
There were structural issues with the AT&T monopoly, however, for example cross-subsidization: the true cost of services was often hidden because they would use optional services (like toll calling) to subsidize basic access, and business lines would cross-subsidize residential service.
The lengths to which AT&T fought foreign connections (aka bring your own phone) probably hastened their demise. In the end, the very technologies that AT&T introduced would turn long distance from a high-margin to a low-margin business. The brass at AT&T had to know that, but they still pinned the future of their manufacturing business on it: a manufacturing business that had never had to work in a competitive environment, yet was now expected to. Because of this and other factors, divestiture was doomed to failure.
I'm a believer in utilities being natural monopolies, but AT&T was an example of effective regulatory capture. It did not, and does not, have to be this way; however, it was.
Eventually the cable guys were coming for AT&T's lunch, regardless of what happened with their monopoly. It's the rare circumstance where two seemingly unrelated utilities converged into the same business (moving bits, instead of analog video or audio) and we lucked into having two internet facilities in large portions of the country
Local access is a very different issue, and I don't really disagree - but the local copper loop carrying broadband is an accident of technological evolution.
When the decisions were made about divestiture, that bit was non-obvious.