Introducing
Yet Another Operating System
brought to you by the creator of
Yet Another Systems Language
mighty ambitious but he certainly has enough opinions on seemingly everything to give shape to something
this might get interesting, what does IQfy think?
>Introducing
>Yet Another Operating System
>brought to you by the creator of
>Yet Another Systems Language
And it'll be yet another POSIX.
It's all so tiresome.
it will have a POSIX compatibility layer but as i understand it, it wants to do its own thing
https://drewdevault.com/2022/06/13/helios.html
These names suck. All the features should be named after waifus.
It's okay anon. I'm working on an OS and I don't even know what POSIX is so I can pretty much guarantee it won't be compatible. Currently making my own version of C tentatively called "D For Dragon!". Target machines are the Clockwork Pi running A-06 and R-01 cores. I will target any other Clockwork Pi core I can get my hands on that works with the 3.14 board.
>All the features should be named after waifus.
about time for sneedOS?
I haven't had people vote on the name yet (because I don't know a good place for anonymous polls). If IQfy calls it SneedOS, my body is ready.
the actual naming issue will be which waifus and the order of them
Something will be named for Tohru and something will be named for Senko. Other than that I'll probably just make polls and let IQfy choose.
I'm still a long way off from having anything useful to name though. My hardware hasn't arrived yet, and D For Dragon! isn't done yet.
https://en.wikipedia.org/wiki/Vue.js#History
>Quintessential Quintuplets
superb taste
>software updates
>you get an anime recommendation for free
Based!?!
just checking, you're aware there's already a D programming language, right?
>just checking, you're aware there's already a D programming language, right?
I hope Maid-D will be the superior D
My D is the superior D
You haven't even begun the OS yet?
What language are you planning to write it in, C or asm? And which assembler/compiler?
I spent a few hours figuring out how to build the image from multiple files, since before I was building it from a single big C file and that was becoming unmanageable.
It works, but I get weird erratic errors: for example, sometimes arguments aren't passed correctly when calling functions in other files, extern variables aren't actually shared between files, etc.
I wonder if it's bugs in gcc or if I'm doing something stupid.
Still haven't gotten to disk IO, not sure how I am going to do that.
>You haven't even begun the OS yet?
I made a small bootloader/kernel for raspi 4 based on this:
https://github.com/isometimes/rpi4-osdev/tree/master/part5-framebuffer
But using the raspi 4 as my daily driver and trying to do bare metal with it at the same time was a pain in the ass so I ordered a DevTerm kit and an extra RISC-V core.
>What language are you planning to write it in, C or asm? And which assembler/compiler?
I'm making my own compiler and my own assembler. I've made a few interpreted languages before, and a couple compiled ones, but I always made my compilers with a compiler compiler. Consequently I felt my knowledge was superficial so I started working through this book:
https://usa1lib.org/book/6153326/368d38
The language will be C-esque. My plan is to literally have it spit out an assembly code file that needs to be fed to an assembler. I've never made an assembler before but I found this to study:
https://github.com/lwiest/Atari6502Assembler
I should have some time because the Clockwork Pi is 60 business days of ship time away.
>I spent a few hours figuring out how to build the image from multiple files, since before I was building it from a single big C file and that was becoming unmanageable.
>It works, but I get weird erratic errors: for example, sometimes arguments aren't passed correctly when calling functions in other files, extern variables aren't actually shared between files, etc.
Can I look at your source? I will try to solve your bug in hopes of learning more about low level. What machine are you targeting?
>I wonder if it's bugs in gcc or if I'm doing something stupid.
gcc is a good compiler. This stuff is just hard. If it wasn't, GitHub would have as many copy-paste kernel experiments as it has copy-paste Angular bootcamp capstone projects.
>Still haven't gotten to disk IO, not sure how I am going to do that.
I haven't gotten that far but I will look through my sources and see if I have anything helpful.
>The language will be C-esque
same shit as always
What would you recommend? I'm only doing a C-like because I'm familiar with C and I have example code for Small C and this is my first compiled language without use of a compiler compiler.
It can always be something later if you have a better idea?
Read this: https://dspace.mit.edu/handle/1721.1/7286
>lisp machines
>oop
NO!
Enjoy your buffer overflows and bloated code.
>Enjoy your buffer overflows and bloated code.
>bloated code
heh
Enjoy no liveliness then.
>Enjoy no liveliness then.
?
Live programming friendliness.
"live programming" is overrated anyways
Thanks for the suggestion MIT fren. I have heard of LISP machines but have no real knowledge of them. In college, they tried to make me learn CLISP for AI but I got filtered by counting parentheses and had to turn my homework in in Haskell.
What makes you like these MIT LISP machines?
>What makes you like these MIT LISP machines?
The unity of design between the supporting hardware, the programming language and the run-time environment (including the operating system). The unrivaled security against the vast majority of vulnerabilities. The high ratio of functionality per line of code.
Interesting. I'll give the paper a read.
I've never heard of live programming. What is it and what is it good for?
imagine programming without needing to recompile, or while the program is running.
That sounds pretty cool but what's the use-case? It seems like it would be a very complicated feature to implement.
>That sounds pretty cool but what's the use-case?
Prototyping, interactive development, etc, but whether this is such a huge benefit compared to a functional language is debatable.
>It seems like it would be a very complicated feature to implement.
it's not that complicated, read up on eval and apply.
The use case is interactive development, which is more natural to some people.
Hey you're still here. I'm home now and will see if I have 1 or 2 hours to look into the gcc issues I mentioned earlier.
I tried to post earlier but IQfy fricking silently ate my posts because of a fake user agent that I had to configure for some other stuff.
Just now gonna get into it for about an hour.
I remember now, the problem I had was transmitting data between the interrupt handling file with the ISR for the keyboard input and the rest of the code.
Well frick me. Couldn't solve it, will have to look at the assembly tomorrow and keep guessing, which sucks.
I'm going to work on it again today while listening to https://invidious.snopyta.org/watch?v=L2QTtdeL3dE for extra rage, wish me luck
IT WORKS!!!!!
The thing I was doing wrong was an off-by-one error in the memory offset I was telling the linker the code would be loaded at.
I'm surprised that it worked at all, holy shit. It would have been faster if it had just failed outright.
I had no chance of ever fixing it just looking at the assembly. What made me realize it was happening was that I figured out how to do debugging using qemu and gdb. I didn't want to cheat in that way but now with a debugger it's going to be much, much easier.
neat
It worked with the wrong address because gcc was generating position independent code by default, but some things apparently still have to be based on absolute addresses. And those were the things that were failing.
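For anyone following along, the usual fix for this class of bug is to stop gcc from generating position independent code and make the linker's idea of the load address match where the bootloader actually puts the image. A sketch of what that setup tends to look like (addresses and file names are illustrative, not the poster's actual config):

```
/* kernel.ld -- minimal linker script sketch */
SECTIONS
{
    /* Must match the address the loader actually places the image at;
       any mismatch here breaks every absolute reference, exactly as
       described above. 0x100000 is just a common example address. */
    . = 0x100000;

    .text   : { *(.text)   }
    .rodata : { *(.rodata) }
    .data   : { *(.data)   }
    .bss    : { *(.bss)    }
}
```

Paired with compiling along the lines of `gcc -ffreestanding -fno-pic -fno-pie -c kernel.c` and linking with `ld -T kernel.ld`, so absolute and RIP-relative references agree about where the code lives.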
Glad you solved it fren. I was asleep again (and am likely soon to sleep again!)
What's next? Disk IO?
Maybe. But I don't know if I should go with AHCI from the beginning or drop back to real mode and use BIOS routines or PIO mode.
You can change your program while it's running, and if you make a mistake it won't explode. Here's a good article about it: https://mikelevins.github.io/posts/2020-12-18-repl-driven/
Thanks MIT fren. I read the article. I am still working through the paper because it's like 50 pages long.
Would anything stop someone from building a REPL-driven environment for any language they wanted? Could someone theoretically build a REPL environment for C?
Sorry if my question is dumb. I have minimal (and old!) experience with CLISP and no experience whatsoever with Smalltalk.
Good to see you again, fren.
>Would anything stop someone from building a REPL-driven environment for any language they wanted? Could someone theoretically build a REPL environment for C?
terry did this
In a few years TempleOS will be studied as a piece of marvelous engineering.
You act like OS researchers haven't known about it since the late 2000s
morons overhyping Terry's work, understand that 99% of the people who want/need to know already know.
Me who actually reads about OS design principles and explores various implementations stuck in the middle of morons
morons underhyping Terry's work, not understanding that making a real language (C) the shell (not a fricking repl) completely changes how the operating system is used. Set /bin/sh to a C interpreter and see what happens.
>Set /bin/sh to a C interpreter and see what happens.
Already done in 1978, it was very popular
https://en.wikipedia.org/wiki/C_shell
Terry did nothing original
>https://en.wikipedia.org/wiki/C_shell
not the same thing, moron
>A shell with C style syntax
It is the same thing sorry, stay mad
Yes terry was moronic
>It is the same thing sorry, stay mad
you are more stupid than fricking terry davis
Keep being mad
Did I not tell you that he was moronic? I did call it in my post
It's easy to spot a non-programmer on this board. Terry did nothing special; he is only memed here because he used the magic word Black person.
Maybe he did nothing new, but the resulting system is interesting.
http://www.codersnotes.com/notes/a-constructive-look-at-templeos/
it's the same thing that already existed in the 80s, it's just interesting because of terry davis, who was a living meme
Look homosexuals, Terry's C REPL is not some marvel of engineering.
You can make your own with a 10 line bash script that calls nano, sends the code to gcc and executes the compiled binary.
Thing is, HolyC isn't only the language the system is written in but also the one used in the shell.
Making programs filled with boilerplate isn't really the same.
Live modifiable programs. This is the kind of stuff that Smalltalk and Lisp do. Can it be done with C?
There really isn't any intractable reason you can't recompile on the fly with C. It just hasn't really been done before.
The problems faced are really the same ones faced when trying to patch proprietary binaries, except easier because you already have the source for the existing code.
There is the problem of inlining and other similar optimizations that would break when substituting a function's code but in principle those can also be recompiled based on the new code if they were kept track of by the compiler.
The hardest problem is that if the code you need to add is bigger than the pre-existing code, you need to add your code at the end of the program and modify ALL the callers to point to the new address. And some of those callers might be dynamically building the address somehow, so you can't just grep for all the callers and modify the address that they point to.
The smartest way to do it would probably be to leave a stub at the original address that jumps to the modified version of the function.
the easiest way of doing this in c (or any other compiled language) is to disable optimizations and leave some space or padding at the end of each subroutine. that way you don't need to reallocate the existing code or update the callers
BUT if you're going to write your own os and pl, why not just use something like a "procedure linkage table" for the program functions that you can update while the program runs? you're probably going to have to implement @plt lookups anyway
Yeah. Doesn't solve the problem with the stack inconsistency when modifying functions that are already in the stack.
C doesn't have read, eval or print.
https://root.cern/cling/
>LISP
UH OH, BRAIN DEAD
Why do some people hate LISP? I am completely foreign to these arguments. I always viewed LISP as an AI language and I've never really been interested in AI or machine learning.
lisp isn't an AI language.
People write most ML systems using Python and C++.
Lisp is a tool with no usage. Its very conception was born from philosophical masturbation rather than actual practical purpose.
As such, it is fundamentally slow and tedious.
Even the dedicated LISP machines were slow compared to their contemporaries.
Stop lying, Common Lisp is magnitudes faster than Python and competes with C++ if you use optional types: http://www.iaeng.org/IJCS/issues_v32/issue_4/IJCS_32_4_19.pdf
Mine is for x86 and I'm using QEMU, but the problem is probably with the generated assembly itself. The extern variables are referred to relative to the instruction pointer in the file that defines them, but in the other files they're referred to by an absolute address IIRC, and I think they don't match.
I'll keep investigating when I get home.
Maybe it's a problem with the custom linker script that I'm using.
I'll see if I can get a minimally reproducible example when I get home.
gcc is a good compiler, but it's not often used with -ffreestanding, which I need to compile code that runs without an OS.
Looking forward to IQfy's next dunning kruger project
Thanks frens. I'll make a thread whenever the compiler is done with a link to the source for it on pastebin.
>using the raspi 4 as my daily driver
Why you make yourself suffer?
Please gib port to a04. I am gimped because i wanted better power efficiency. Everyone only does a06
Challenge accepted fren. I just bought an A04 core to work on. All my equipment is still waiting for shipping though, so it will be a while before I can experiment on the bare metal.
I like the idea of the LISP machine that the MITs keep showing me, but I am not sure I am familiar enough with LISP to make a self-hosting LISP compiler. Especially given that I am working on my first self-hosting compiler, and that will require writing the compiler in its own language without use of a compiler compiler, which is something I've never done before.
If I finish "D For Dragon!" (or maybe "MAID-D"?) and my hardware still hasn't arrived, then I will try to create some sort of "MAID-LISP" (or maybe "L! for Lucoa") with MAID-D to try to prove its usefulness before diving into making some kernels for the DevTerm.
While lisp machines were cool, the thing now is pure functional programming.
FPBP. There are like a dozen POSIX operating systems already.
>every component is named after a god
and that's how I know it'll be garbage
Named after celestial bodies you dumbass
then why gaia instead of terra
>In Greek mythology, Gaia [...] is the personification of the Earth [...]. Her equivalent in the Roman pantheon was Terra.
he's using the greek names
no he's not
you're right, I only looked at the last two
wtf is he doin
making shit up as he goes
he's putting all this info about his "planned" work out there first in a futile effort to prevent his adhd- and masturbation-addled brain from ditching this project the second real code needs to be written
>In Greek mythology
so named after gods
Terra comes from the Roman pantheon, dumbass.
but the project isn't named terra
it's named gaia, and ares, and helios
I was pointing out that for some reason you believed terra to be okay but not gaia, even though they are both earth
>after celestial bodies
and those gays don't even know where the names come from
I find this hard to believe, since all of those objects predate life on Earth.
looks fun!
>Venus: real-world driver collection
read: linux drivers with a shim
which, by the way, completely disproves the need for a microkernel, since the drivers are already isolated enough with proper interfaces to be able to run on a completely foreign base
The point of a microkernel isn't portability, it's to prevent shitty misbehaving drivers from messing with the other parts of the kernel.
hare sucks so hareos is going to suck
Will it support Wayland?
lol
Stop bullying Drew NOW!
https://drewdevault.com/2022/05/30/bleh.html
sounds like it's immoral to claim you'll add a borrow checker to your memelang later, and sourcehut sucks
No, kys wayland shitter
>I have made no shortage of mistakes, and there are plenty of hurt feelings which can be laid at my feet. I am regretful for my mistakes, and I have worked actively to improve. I think that it has been working. Perhaps that’s arrogant of me to presume, but I’m not sure what else to do. Must I resign myself to my fate for stupid comments I made years ago? I’m sorry, and I’ve been working to do better. Can I have another chance?
Iktfb
But with an ex-gf
Drew DeVault is an insufferable child who believes himself to be much smarter than he is just because he reimplemented existing solutions. Years ago, he went as far as spamming porn on a forum because some other guy said something he disagreed with (I can't find the actual thread right now, but you can probably find it yourself if you look it up on a search engine).
I have a lot more respect for people like Terry, who, while clearly insane, at least did something that was actually original and truly personal.
>reimplemented existing solutions
this is what all these clowns do, they reimplement something poorly, then think they have iqs over 120. unlike my cow orkers, im allowed to call drew a Black person monkey
he usually does that every time he's kicked out of a project, it's a shame that most of those irc channels have no archives. that's one thing that I have to give to drew, he knows where to sperg and how to cover his tracks. where are the kiwifarmers when you need them?
Nice. GL to him, he's only just starting to get into the difficult stuff.
don't be ridiculous, thinking of the names is by far the hardest part
I read about a third of MIT's paper then I fell asleep. Back to the paper now. I have questions but I'm gonna finish it first to see if the answers are just later in the paper. I like that LISP can't have a buffer overflow and I'm watching a video by another MIT that explains how other languages can be embedded in LISP and treated as extensions of each other. This stuff is fascinating.
He already failed that by not using waifus.
It could just be added though?
Sorry fren, I was asleep.
Thanks fren. It's interesting to see that this has been done with C.
why not just work on something new, or some area where open source isn't present? Like electron microscope software. I would love an open source version of that. The usual stuff I have to use at university is fricking garbage; sadly it's the only high-tech option, so I'm forced to use Windows XP.
because working on new stuff is hard
re implementing existent stuff poorly is easier to do
>why not just work on something new, or some area where open source isn't present? Like electron microscope software. I would love an open source version of that
Because most programmers don't need such tools??? You're currently in the perfect position to put your money where your mouth is and actually start writing some code.
I don't have the knowledge to write driver code, otherwise I would do it. And yes, people will need an open source alternative once the current SEM generation is affordable for 20k on eBay, like the old JEOL units that go for 5-10k €. You can make your own sensors and microchips with that. Or open a small lab taking requests from industry. It's a good side business while doing a PhD.
I don't have access to any of that equipment or knowledge of what it does. You're the first person I've ever heard mention wanting open source microscope software.
I think the best way for you to get such a thing would be to make open source microscope hardware. That way people can openly study the machine and play with code for it.
Thanks for the film fren.
test
>hurr let's pick a bunch of names that are as ambiguous as fricking possible so searching for anything related to it turns into an absolute pain
why are code monkeys like this?
why lisp when prolog exists?
Another way to do it would be to straight up generate a new binary image, overwrite all the code in the text section of the process in memory with it and modify the instruction pointer to point to the equivalent instruction in the modified version of the code.
Another big problem when doing this live is that you need to ensure that pushes and pops stay matched. Adding a new stack variable anywhere in the code may add a new push in the function prologue, but if execution is stopped in the middle of that function, the modified epilogue will try to pop one more slot than the original prologue pushed, and the stack will break on return. This is a really hard problem and I don't know how you would solve it. Maybe just not allowing the modification of functions that are currently on the call stack.
It's not like you can map the modified variables to the stack because the meaning of the data on the stack might be different depending on what path the program took. How does a debugger build a call stack anyway? I'm not sure.
Related talk https://invidious.snopyta.org/watch?v=LwicN2u6Dro
Good luck to drew. Big undertaking and most OSs get no user base.
Drew is a lolcow
Thanks for marking for me the only posts that matter in this shitty thread.
(You)
https://urbit.org/docs/arvo/overview
Does it run baremetal yet?
How do you learn Hoon? Every time I try to get into some tutorial I get lost in the writing and it feels like a fever dream.
https://hooniversity.org/
lol these troons really need to get a life/job.
Absolutely megabased!
I am an SBC enthusiast.
I don't have $20k to spend on equipment I don't know how to use.
coom
>Anybody else excited for slight variation on Unix using a slight variation on C
No.
The really sad and hilarious thing is that this might have actually been a fun project idea if it wasn't being done in his memelang.
>ares
just