The Tulip Creative Computer
I got so much joy from computers and I'd like my kids to have that kind of experience too without accidentally detouring into social media (which has my mind in a vice grip).
Still a couple of years away, but I think I'd like to evaluate this kind of device then and see if it's the right model to use.
Half the fun was not knowing how to do something. There was no other way to satisfy curiosity than to tinker endlessly and constantly seek out information. Stumbling upon unusually good programs made it seem like anything was possible regardless of the machine it ran on. Video games and the demoscene were like that for me, and now any modern machine really can run almost anything.
Programming can still be fun like that, but often in the context of existing ideas. My parents had similar feelings about new music and cars. The sense of wonder decreases when the bar is raised. That's not to say there isn't a ton left to explore, but that's the impression when curiosity is too easily satisfied. You have to keep up and find new ways to stay curious. We consume way more than we create these days.
The New Yorker had a memorable single panel comic by David Sipress with an old man saying "everything was better back when everything was worse". I just had to mention it.
I remember a large part of the fun was that we couldn't just look things up on the web - it didn't exist in our home in the eighties. Instead we'd pore over BASIC programs on floppies, changing one thing, say GRAVITY=0.1, and finding out the banana now flies almost straight up.
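For anyone who never met one of those listings: the whole experiment fits in a few lines. A hedged sketch of the same idea in modern Python (the names and numbers are invented for illustration, not taken from any actual listing) - lower the gravity constant and the banana flies much farther:

```python
# A toy projectile loop in the spirit of those old BASIC listings.
# GRAVITY is the one "magic" constant a kid can tweak and re-run.

def banana_flight(gravity, vx=4.0, vy=10.0):
    """Return how many time steps the banana stays above the ground."""
    x, y, steps = 0.0, 0.0, 0
    while True:
        x += vx
        y += vy
        vy -= gravity  # lower gravity -> flatter arc -> longer flight
        steps += 1
        if y <= 0:
            return steps

print(banana_flight(1.0))  # normal gravity
print(banana_flight(0.1))  # the tweaked value: stays up far longer
```

The fun was exactly this loop: change one number, type RUN, see what happens.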
Or meticulously typing in the source code printed in hand-me-down magazines. Evenings with a friend, one typing, the other reading out loud. Then way more evenings finding all the typos and bugs. Then one or two evenings running the game we just "wrote" and getting bored immediately. And starting to change things.
This is how I learned programming, and it's what has paid my bills for over 25 years now. There were few university programs for programming back then, but mostly, young arrogant me figured "well, I taught myself programming, so I'd better follow a university program that teaches me stuff I don't already know XD".
The tinkering and creative part has been lost for me by now. I lament that. So I've put aside a fund; I'm finishing off some contracts now, and from this summer on I will do unpaid work of "creative coding". Making "art" with software - something I now do in spare time, full-time. Because that tinkering is what drew me in. Not the scrum rituals, spaghetti-code wrangling or layers of architectural enterprise abstractions. But the fun of nesting nested loops and seeing my name fly over the screen in weird patterns, or the joy of making the matrix printer play a "song" by having it print weird ASCII strings.
I've been working professionally now for over a decade, but got started long before that as a child. Despite the endless negative things I could say about the modern era, I don't feel like any of it impacts my enjoyment of my work or gets in the way of my creativity.
I think this is because the closest I've ever been to truly being alone with the machine is writing programs for my TI calculators, but even then I still had ticalc.org. Some programs on there were brilliant, but most were awful. It was the perfect balance for people my age at the time. Despite what people believe today, especially with their LLMs, I don't think the landscape has changed much in that regard. There's still a lot of awful code with few brilliant examples. That leaves room for me to work on new interesting stuff or improve what's there without having too much help spoiling it.
This kind of limited device is something I've been thinking about with respect to what interactions I want my children to have with computers. I remember when I was 9 years old and we got these computers at the lab at school and we wrote some LOGO and BASIC and it was a mind-blowing experience. We were drawing SQUARES! And we were making TRIANGLES of ASTERISKS! Hahaha, what a glorious thing that felt like.
Well, Minecraft 'redstone' works a bit like that?
Of course, it's embedded in a much bigger program, but I'm not sure that makes a difference to the kids?
Today, kids are surrounded by all kinds of tech. They see people interacting with tech in all kinds of ways from the moment they are cognizant. It's much harder to create that wow moment now.
What's toxic for children's development is the low bar of entertainment associated with something like an iPad connected to the Web: immediately, without any effort, you get entertained, and there are hundreds of movies you can watch. Got bored by them? There are millions of funny TikTok clips to view and share with others. Or go to the App Store and download any of a million games with just one click each; none of it takes any EFFORT.
In contrast, in the 1980s, you had to put some effort in, in order to eventually harvest your reward:
You may have wanted to play a game, but you had to type in 16 pages of hexadecimal DATA statements encoding the machine code of the game before being able to type RUN.
You may have wanted to draw a plane and make it fly across the screen, but you had to calculate the decimal numbers representing the sprite's bitmap by adding some powers of two.
You may have wanted to write your own game in MOS 6510 assembler, but first you had to learn how to code and what the registers and opcodes were, and you had perhaps a magazine article, a book, and an assembler or machine monitor on a tape, and no Web to look up solutions to problems on StackExchange. Heck, you may not even have known anyone to exchange info with - coming from a small town of 8000, I didn't even know who else had a computer, if anyone. So after buying my monthly home-computer magazine, I would hang around at the newsagent to see who bought the only two other copies of the same magazine, then wave my copy and, shy as I was, introduce myself.
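The sprite arithmetic in the second of those examples is easy to show. A small Python sketch of the method (the plane shape is invented for illustration): each row of eight pixels becomes one DATA value by adding the powers of two for the set pixels.

```python
# Each sprite row is 8 pixels wide; a set pixel contributes its power of two.
def row_to_byte(row):
    """Convert a string like 'XX..XX..' into the decimal value you'd type in."""
    value = 0
    for bit, pixel in enumerate(row):
        if pixel == 'X':
            value += 2 ** (7 - bit)  # leftmost pixel is bit 7 (worth 128)
    return value

# A tiny made-up plane, row by row:
plane = [
    "....X...",
    "XXXXXXXX",
    "....X...",
]
print([row_to_byte(r) for r in plane])  # -> [8, 255, 8]
```

Back then you did this sum by hand on squared paper, one row at a time.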
The youngsters today - through no fault of their own - have it too easy, so they naturally don't see the point in making any effort to learn "deep tech" when there are thousands of lower-effort activities they can pursue first for entertainment.
A device that boots right into µPython and that can make music & be programmed to make music could be one such successful path...
when he is old enough to get some joy from it.
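As a sketch of how small that first music program can be: here is plain Python computing one second of a 440 Hz sine tone as 16-bit samples. How you'd actually hand the buffer to Tulip's synth is not shown; the sample rate is just an assumption, picked as embedded-friendly.

```python
import math

RATE = 22050  # samples per second (an assumed, modest embedded-style rate)

def tone(freq, seconds=1.0, rate=RATE):
    """Generate a sine tone as a list of signed 16-bit sample values."""
    n = int(seconds * rate)
    return [int(32767 * math.sin(2 * math.pi * freq * i / rate))
            for i in range(n)]

samples = tone(440)          # concert A
print(len(samples))          # one second's worth of samples
```

Change `freq`, re-run, hear the difference: the same tweak-and-RUN loop as the old BASIC days.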
[1] https://grokipedia.com/page/philips-natuurkundig-laboratoriu...
I'm pretty sure I had seen some promotional material of theirs the last time I was in NL, so I didn't realize they had gone out of business back in 2008/2009 already.
Since the original company both changed names and was subsequently liquidated in bankruptcy nearly 20 years ago... that seems unlikely. There's only so many names out there, and occasionally they get fairly recycled.
If I were to build a synth a year ago I probably would have used Rust compiled to WASM and running in the browser. This thing has a lot of the same functionality, but there you have roughly 30 million lines of code for the OS, ~30 MLOC for the browser, and another ~30 MLOC for Rust/LLVM.
And that doesn't even get into the cost of materials or power savings.
Obviously it's not apples to apples but it really makes me wonder how much of that stack we need for most programs.
I wonder the same thing a lot. I also wonder how AI will fit into this problem.
The cost of abstraction is always dependencies.
https://github.com/coolcoder613eb/notepad
I haven't done too much coding with it, but I have on occasion fixed bugs in the text editor on the T-Deck itself.
It seems like the Tulip could definitely be used for something like that, though you might have to write quite a bit of your own framework code in Python.
Also love how absolutely minimal it is in size, and if you didn't notice, the screen is a touchscreen. And they have a basic set of UI widgets.
Also interesting: the gfx lets you overlap sprites, the bitmap, and text mode. You can tell the designers have a lot of XP on 8-bit systems. And the bitmap is a little larger than the screen so you can do some superbitmap stuff. It's not terribly larger, just a bit.
I haven't been using it as much for its synth capabilities, ironically, but for making sequencers for external instruments. I believe it also has audio in...
Also the Discord is helpful. 10/10
That it's built off Micropython is a huge deal. I've started using it by default for my hardware projects and it makes everything easier - writing drivers, playing with user interfaces, etc. Loads of regular Python libraries work and I can even grab them over the Internet. Like how I imagine it was running Forth or smth on embedded kits back in the day (ah maybe not the networking bit)
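A taste of why driver work is pleasant there: much of it is just packing register bytes, which reads the same in MicroPython as in desktop Python. A minimal sketch, with a register layout that is hypothetical rather than taken from any real part's datasheet:

```python
# Pack a sensor configuration byte from named fields.
# (This register layout is invented for illustration only.)

def pack_config(enable, gain, rate):
    """bit 7: enable flag, bits 4-6: gain (0-7), bits 0-3: sample rate (0-15)."""
    assert 0 <= gain <= 7 and 0 <= rate <= 15
    return (int(enable) << 7) | (gain << 4) | rate

print(hex(pack_config(True, gain=2, rate=5)))  # -> 0xa5
```

On a real board you'd then ship that byte over the bus, e.g. with MicroPython's `machine.I2C.writeto_mem`, and interactively poke at the device from the REPL until it behaves.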
You can use Tulip to make music, code, art, games, or just write.
Am I wrong to think statements like these are just aspirational warm-and-fuzzies about the product without any real substance?
You could do all those things on anything, but they are typically incongruent with one another. Whether you are a beginner or a pro, you're going to be better off doing each of them on a "more-standard" device.
This seems to be targeting the market of users with the following intersecting interests:
* DIY hardware enthusiast
* musician
* Python developer
* maybe also wants graphics...?
Seems a small segment to me, but I assume I'm missing something here.
The Pocket Operators have something similar (the KO at least, maybe the others). If you've written samples into them you want to preserve for playing live, you can snap a tab off and then they're read-only - no surprises on gig night.
I'm particularly sensitive to shallow critiques of new ways of computing, especially those that encourage and enable people to be creative. Whether a project is successful or not, it's nice to see something that isn't a "boot up your general-purpose computer and then immediately open a browser" style of computing.
Attempting to get people to interact with the real world and also be creative should be commended.
I’ve been lulled into novel only-limit-is-your-imagination work environments that try to convince me they will make my abilities “transcendental”.
A little while later, I run into software or hardware limitations and end up with a physical malaise after troubleshooting the problem for hours.
Take the “just write” angle. I do technical writing. I don’t trust that the built-in dictionary for that device is ever going to meet my spellcheck needs. I need mah Microsoft Word that I know how to navigate like the back of my hand, and I’m set.
Don’t promise a “Zombocom” device that doesn’t actually deliver.
* Boots almost instantly into a usable system
* I can read and understand every line of code that is running
* I can understand all of the hardware it's running on
We've gotten so used to computers not working. Weird stuff breaks all the time and even experts can only guess why beyond turning it off and on again, which takes minutes for most devices.
I dream of a world where we trust computers to work and be fast. It's completely possible, but step one is reducing complexity by several orders of magnitude.