Hacker News
3 years ago by qbasic_forever

I get more and more excited the closer we get to an eventual reality that a kid learning about programming, operating systems, Linux, etc. is just a click on their phone away from starting it all. No gatekeepers--appstores, sideloading, hacks, jailbreaks, locked down BIOS, etc.--to hold them back. Your browser, a shell, and the entire world in front of you.

3 years ago by mechEpleb

I get the feeling that the world was significantly more open to kids growing up with access to computers in the 80s and 90s, because as someone whose formative years were the 00s and early 10s, the interesting bits of technology were already buried under a thousand layers of abstraction and indirection. Kids nowadays won't ever learn what a shell is unless they go out of their way to learn it for some reason.

3 years ago by todd8

My daughter finished her CS degree about a year ago. I was glad to see that one required class had her build a very simple computer from gates as her final project. I don’t know the details, but I think the computer had perhaps 10 different instructions. She didn’t like that project very much, but I thought it was valuable; someone has got to understand the principles of operation at that level.

I started programming in the 60’s and at times had to load instructions into machines using binary on front panel switches. That’s the reason machines of that time had lights and rows of switches on the front: one could debug programs by looking at the lights to see the program counter, a data value, or an instruction. See [1] for a photo of a large front panel on an iconic machine of the time.

I even recall pulling plug panels out of card-processing equipment to reprogram the sorting and selecting of input data being run through the machine as huge stacks of punch cards; see [2] for a picture of a mid-20th-century plug panel for data processing.

The layers of abstraction are important. They enable us to construct some of the most useful, complex, and intricate artifacts ever made on our planet. Today I program in high level languages, and I get to use powerful frameworks, database systems, and amazing hardware right on my desk. Yet, I do miss some of the fun of invention and hacking on systems that I really understood in depth.

[1] https://en.m.wikipedia.org/wiki/Front_panel#/media/File%3A36...

[2] https://www.ebay.com/itm/133017817600?hash=item1ef87ad200:g:...

3 years ago by ksec

>My daughter finished her CS degree about a year ago.

Interesting, I thought CS was all software, and that building computers from gates and low-level programming were more of an EE / Computer Engineering thing.

3 years ago by dimal

Sorry, you would be wrong. There was no internet. You were lucky if there was one other person around you who knew BASIC and could answer questions. Everything you needed to learn came in books that cost $30 each (at a time when $30 was a lot of money), and most of the books at the computer store were for using spreadsheets and word processors, not actually coding. Great, you had a DOS shell right in front of you, but learning what to do with it was a struggle. And writing .bat files isn’t very exciting. Maybe if you were in a place like SV, there would be tons of resources to learn, but in most other places you would be in an information desert. I was interested in computers but gave up because I hit the limit of what I could learn pretty quickly and couldn’t get any further. Kids have it MUCH better today.

3 years ago by sensanaty

That's really not true though, lots of dev-focused tools explicitly require hopping into a terminal and typing away commands. Plus at a certain point, anyone with any real curiosity is going to think "I wonder what's hidden behind these abstractions I see all the time?" and dig deeper anyways.

3 years ago by mechEpleb

You'd think that, wouldn't you? Yet here I am, someone who grew up thinking computers were obtuse and boring because getting the computer to do anything interesting seemed like it would require knowing a thousand things not related to the issue at hand. I was always a mechanically minded person, so while the inner workings of things seemed interesting, making toy websites (the entry level computer thing to do in that time period) seemed about as interesting as watching paint dry.

But here I am, working as a software engineer and halfway through my MSc in computer science. It took a couple of low-level microcontroller classes in my mechanical engineering undergrad for me to see the light.

3 years ago by bicolao

It's still a lot harder (abstraction layers add complexity) to get close to metal as opposed to say, DOS.

3 years ago by DaiPlusPlus

You’re still stuck in the browser sandbox (I prefer to think of it as a Browser-Plato’s-Cave in the context of whole nested systems), and you can’t escape the rectangular box imposed on you by the browser. Especially on mobile OSes: there is essentially zero system integration by going that route, because Apple and Google both need something to entice people into forking over that lovely 15-30%.

Things like Push notifications. Background activity. Guarantees about data persistence. First-class Home Screen app icon. Ability to directly share data with other native applications even if they don’t want to (to the extent it’s enabled by the platform’s native Share Activity). And to a lesser extent: the ability to use native widgets for the best user experience on that platform. Too many SPAs fall into the uncanny valley when they start to look too similar to native widgets - and it’s off-putting.

And the fact that Fortnite still isn’t available as a PWA on iOS is telling… I was expecting them to launch an OnLive-like service rendered to a <canvas> over WebRTC by now… and worryingly this gives Apple a strong incentive to immediately halt any work on improving WebRTC in PWAs…

3 years ago by outofpaper

> I was expecting them to launch an OnLive-like service rendered to a <canvas> over WebRTC by now… and worryingly this gives Apple a strong incentive to immediately halt any work on improving WebRTC in PWAs

Like Stadia? (https://youtu.be/3_RAyxpFurU?t=113)

3 years ago by DaiPlusPlus

Exactly the same, yeah.

3 years ago by croes

Are we getting closer or is it the opposite?

3 years ago by qbasic_forever

Very close, almost all of POSIX is implemented in browsers now. Look at JSLinux https://bellard.org/jslinux/ for the classic example, or more recently at projects like Pyodide, which compiles real desktop Python to run fully in the browser with WASM: https://github.com/pyodide/pyodide

AFAIK there are a few loose ends still TBD with WASM & WASI to support sockets and networking, but once that's in place we'll likely have a full WASM POSIX environment in your browser. Get some of the core tools like gcc prebuilt and you're good to go: just start building the world in your browser. No app store reviewer to hold you back, no megacorp to decide homebrew apps aren't allowed anymore... the world is your oyster to create and share anything.

3 years ago by dmitriid

> almost all of POSIX is implemented in browsers now.

POSIX is not implemented anywhere. There are degrees to which it's implemented in various systems. It also doesn't mean that having POSIX implemented makes things accessible to anyone, or that this 40-year-old standard is even relevant anymore.

> Look at for example JSLinux https://bellard.org/jslinux/ for the classic example

It doesn't mean that POSIX is implemented in the browser:

- It's basically an emulator running on top of some browser tech that runs Linux.

- Linux is mostly, but not entirely, POSIX-compliant.

> we'll likely have a full WASM POSIX environment in your browser. Get some of the core tools like gcc, etc. prebuilt and you're good to go to just start building the world in your browser

This will literally never happen outside of some geek circles. If only for the simple reason: you'll have to download the entirety of Linux and its tools into the browser for every user.

3 years ago by d_tr

On the other hand, having a 70M sloc OS act as a bootloader for a similarly sized pile of hacks which is treated like an OS does not sound like a very exciting platform...

3 years ago by thunderbong

Submitted multiple times earlier. The thread with the most comments (227 comments):

https://news.ycombinator.com/item?id=7605687

3 years ago by base698

Wat?

3 years ago by tedk-42

Javascript will never die unless something comes along and either replaces browsers or replaces the scripting language used by browsers.

WASM is not a replacement for JavaScript and never will be. It's not even a damn language.

3 years ago by brutal_chaos_

I beg to differ. WASM, if it doesn't go off the rails with future revisions, will supplant JavaScript because it is a compilation target. This, in effect, opens up any language to run on the web. JavaScript may not fully disappear, but its usage will most likely greatly diminish.

3 years ago by tedk-42

A small JavaScript text file which can currently run natively on any browser will always be far superior to a binary WASM blob that's compiled.

WASM fills in a use case where you need to run highly performant code on a browser. It just so happens that you can write it in whatever language you choose.

3 years ago by qbasic_forever

There's no difference--that small JS text file gets compiled on the fly into platform-specific assembly language with today's JIT compilers in browsers. WASM is just skipping the text source step and giving browsers something they can compile directly.

I do agree it is a shame to lose direct insight into the text source code, but let's be honest: the production JS shipped to browsers today is far, far from human readable. It's minified and shrunk to the smallest, most incomprehensible degree to save bandwidth. View source and try to read and understand the JS on any big site like facebook.com, etc. and you won't get very far.
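To make that "compile directly" point concrete, here's a toy sketch: the classic "add two i32s" WASM module, with the bytes written out by hand for illustration (not output from any real toolchain), instantiated straight from its binary form with no text-parsing step:

```javascript
// The classic "add two i32s" module, hand-assembled in the WASM binary format.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                   // magic: "\0asm"
  0x01, 0x00, 0x00, 0x00,                   // binary format version 1
  0x01, 0x07, 0x01,                         // type section: 1 entry
  0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,       //   func (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                   // function section: 1 func of type 0
  0x07, 0x07, 0x01,                         // export section: 1 entry
  0x03, 0x61, 0x64, 0x64, 0x00, 0x00,       //   export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,             // code section: 1 body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,       //   local.get 0, local.get 1, i32.add, end
]);

// The engine gets something it can translate to machine code directly --
// no source text, no parsing.
const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));

console.log(instance.exports.add(2, 3)); // 5
```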

3 years ago by moron4hire

> WASM fills in a use case where you need to run highly performant code on a browser.

That's not exactly correct. It's possible to write equally performant code in just Javascript, by being careful to avoid certain features of the language (e.g. garbage collection, expando objects, dynamic types for variables, etc). Writing code in a stricter language like rust or c++ might make hitting those targets a little easier, but also using TypeScript can make it easier, too.
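As an illustrative sketch of that "careful" style (names invented here): preallocate typed arrays, keep functions monomorphic, and allocate nothing per iteration, so the JIT can emit tight code and the GC has nothing to do in the hot loop:

```javascript
// Hot loop in a GC-avoiding, monomorphic style: one preallocated
// Float64Array, no object allocation inside the loop, no type changes.
function sumSquares(values) {
  let total = 0;
  for (let i = 0; i < values.length; i++) {
    total += values[i] * values[i]; // always a float64
  }
  return total;
}

const data = new Float64Array(1_000_000); // allocated once, reused
data.fill(2);

console.log(sumSquares(data)); // 4000000
```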

The exceptions are loading, parsing, and some JIT profiling: WASM can potentially be smaller, and it can skip the majority of parsing and early profiling.

So WASM shouldn't be thought of as a tool for achieving speed. It should more be thought of as a means for running cross-platform code, for languages other than Javascript.

3 years ago by dmitriid

> will supplant JavaScript

Not until either of the two things happen:

- WASM gets GC and can work with DOM directly

- DOM is supplanted by canvas- and webgl-based libraries and frameworks
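On the first point, the current state of affairs can be sketched like this: a WASM module has no way to reach the DOM itself, so every such interaction is a call out to an imported JavaScript function. The module below is hand-assembled for illustration; in a real app the import would be DOM glue code:

```javascript
// A tiny hand-assembled module that imports one JS function ("env.js") and
// exports "run", which does nothing but call back into JS. This is the shape
// of all WASM <-> DOM interaction today: JS glue sits in between.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,        // magic + version
  0x01, 0x04, 0x01, 0x60, 0x00, 0x00,                    // type section: () -> ()
  0x02, 0x0a, 0x01,                                      // import section: 1 entry
  0x03, 0x65, 0x6e, 0x76,                                //   module "env"
  0x02, 0x6a, 0x73, 0x00, 0x00,                          //   name "js", func of type 0
  0x03, 0x02, 0x01, 0x00,                                // function section: 1 func
  0x07, 0x07, 0x01, 0x03, 0x72, 0x75, 0x6e, 0x00, 0x01,  // export "run" = func 1
  0x0a, 0x06, 0x01, 0x04, 0x00, 0x10, 0x00, 0x0b,        // body: call 0, end
]);

let calls = 0;
const glue = { env: { js: () => { calls++; } } }; // stand-in for real DOM glue

const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes), glue);
instance.exports.run();
console.log(calls); // 1
```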

3 years ago by habibur

Those things will be built over time. It's a large field out there waiting to be explored.

3 years ago by Fergusonb

Javascript is still popular today even if you ignore the browser. Node is all over the place.

3 years ago by mysterydip

Javascript is the C of the web. We had a chance to learn from hindsight, and instead we made a different behemoth without realizing it.

3 years ago by kortex

Or is it? One thing JS and C have in common is a minimal core, kinda loosely defined, relatively easy to implement, that nonetheless lets you create powerful abstractions. I don't think it's a coincidence. I think anything more sophisticated and "well-built" ironically would have seen less initial traction, and lost first-mover advantage.
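As one toy illustration of that minimal-core point (example invented here, not from the thread): closures alone are enough to build object-like abstractions with private state, no class machinery required:

```javascript
// An "object" with private state, built from nothing but closures.
function makeCounter() {
  let n = 0; // private: reachable only through the returned functions
  return {
    inc: () => ++n,
    value: () => n,
  };
}

const c = makeCounter();
c.inc();
c.inc();
console.log(c.value()); // 2
```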

That JS wasn't a Lisp is kind of a pity, though. But again, Lisps have been around longer than almost any other language yet are still comparatively obscure, which tells me something inhibits their broader adoption.

3 years ago by bryanrasmussen

Sad news for some people, but now that it has been truly made independent of the browser, JavaScript will never die, in the same way that every other programming language that has ever been used to build applications is still puttering around.

There are too many people with worthwhile JavaScript skills to service, too many companies who have things built in it and employees with those skills who will keep building things in it.

Maybe in 2050 there will be Cobol style posts on HN about JavaScript.

on edit: changed removed from to something more understandable

3 years ago by tbrownaw

I think it's not too unreasonable to say that this includes a successful prediction of webassembly.

3 years ago by maven29

Wasn't the contemporaneous asm.js already in commercial use in 2014?

3 years ago by m1kal

I tend to disagree. Asm.js is dying. WebAssembly is not "everything on top of JavaScript". While higher layers can work the same as they could with asm.js (still in the web browser), it's not what GB predicted.

3 years ago by AprilArcus

asm.js already existed when this was written with AOT compilation support in Firefox

3 years ago by trixie_

I was thinking of a new 'open source website' concept, where you can call your website 'open source' if all the JavaScript behind it is unobfuscated/unminified, with comments still there.

I would even like to take it a step further, an 'open source' OS where every binary has symbols available. The system can be stopped anywhere and the full stack trace is understandable.

3 years ago by junon

Symbols being stripped isn't usually for "open source" reasons. Neither is minification. It's done to reduce sizes.

A binary with symbols has tons of extra cruft that is largely unnecessary. Even if you have them, what good does a stack trace do if you don't also have the source to fix it?

3 years ago by trixie_

The point is complete transparency into everything happening on your computer through installed application, operating system, or loaded website. Absolutely nothing would be running as an obfuscated binary.

The source to build it is not just 'open'. All running binaries would be able to be mapped to their corresponding source.

3 years ago by junon

Again, we don't "obfuscate" binaries to make them harder to decipher. We do it because debug symbols are massive and cause a lot of bloat and sometimes even performance hits.

If you want a system like that, just compile Linux and all of your tools in debug mode. But again, why do this when you can just recompile from source?

This is a gross misunderstanding of how computers work, I feel.

3 years ago by DaiPlusPlus

> Even if you have them, what good does a stack trace do if you don't also have the source to fix it?

It makes patching the binary yourself a heck of a lot easier. As is often the case with legacy enterprise software from a vendor now long-gone…

3 years ago by junon

Do you want full open source, or to allow enterprise software? Pick one.
