Basic Linux Boot On Open Graphics Card
David Vuorio writes "The Open Graphics Project aims to develop a fully open-source graphics card; all specs, designs, and source code are released under Free licenses. Right now, FPGAs (large-scale reprogrammable chips) are used to build a development platform called OGD1. They've just completed an alpha version of legacy VGA emulation, apparently not an easy feat. This YouTube clip shows Gentoo booting up in text mode, with OGD1 acting as the primary display. The Linux Fund is receiving donations so that ten OGD1 boards can be bought (at cost) for developers. Also, the FSF shows its interest by asking volunteers to help with the OGP wiki."
Do we want an open source video card? (Score:2)
I can understand open sourcing the software, but can someone explain the benefits of opening the hardware as well?
Re:Do we want an open source video card? (Score:4, Insightful)
Do you want to be tied to a vendor?
If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit.
Open hardware has the same value as open software.
Yeah but FOSS is a vendor too. (Score:3, Insightful)
If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit
Yeah, but open vendors are vendors too. That's the thing. Basically, what you are trying to do is suppress innovation for the sake of commoditization, and that's not a proposition that people want to make.
Re: (Score:2)
But if you have the HDL code, you can choose anyone you want to produce it, or use FPGAs.
trade innovation for commoditization? (Score:2)
What I want to know is why we put up with proprietary in the CPU itself.
I mean, if we really want free and open, why aren't we pushing for a truly free and open (and maybe even elegant) CPU, instead of being satisfied with the pseudo-open x86 junk?
Re: (Score:2)
Aside from being an awesome arch, Sparc is an open standard.
Re: (Score:2)
Aside from being an awesome arch, Sparc is an open standard.
Yes, but you could make the case that AMD64 and POWER are both better.
Re: (Score:2)
Do you want to be tied to a vendor?
If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit.
Open hardware has the same value as open software.
Right... every time I run a high-end game on my proprietary 3D driver I have to stop and flagellate myself halfway through to dull the guilt... not...
Drivers (Score:5, Informative)
If I don't like my NVidia card, I can move to a competitor's chipset.
Only if the competitor is friendly to the free software community. There are plenty of hardware makers that have declined the free software community's requests for low-level specifications useful for writing free drivers.
If profit from evil outweighs profit from good... (Score:2)
Which means that there is an incentive for those companies to be friendly to the free software community, because the competitors that are FOSS-friendly will (hopefully) do better.
Path A: Use know-how subject to third-party patents, copyrights, and trade secrets to improve the performance of your product. Your product sells well to the mass market but poorly to free software enthusiasts.
Path B: Do not use know-how subject to third-party patents, copyrights, and trade secrets to improve the performance of your product. Your product sells well to free software enthusiasts but poorly to the mass market.
If Path B produces less profit than Path A, there is no such incentive for a com
Re: (Score:3, Interesting)
Also I guess
Re: (Score:3, Insightful)
Well obviously it's of academic interest. American consumers have sunk billions into video card research and for the most part the implementations are shrouded in mystery locked up in labs.
The problem with this line is that while American consumers may have sunk billions into buying video cards, they were never promised any of the knowledge required to build one. In other words, you bought a product, not the product design process, and your line seems to suggest confusion on that part.
Re:Do we want an open source video card? (Score:5, Informative)
and your line seems to suggest confusion on that part.
Doesn't seem that way to me. He's just pointing out that when compared to other electronics, we have shockingly little info available.
Even for CPUs, there are fully documented "open-source" microcontrollers available, but for GPUs there's basically nothing. How it's all done is a big mystery. And now we've gone so far that GPUs are doing incredible things like juggling 10,000 threads to manage all the shading when you fire up a game.
nVidia and ATI have stated that GPUs are many times more complex than CPUs, and I fully believe them.
Re: (Score:2)
It's not that big of a mystery, nor does it matter much in the grand scheme of things.
There are AVR microcontrollers that can output a VGA signal via bitbanging, so that part is obviously simple enough.
You don't really HAVE to continue legacy VGA support; make a new standard that's a good solid standard and will in some way benefit PC manufacturers, and you'll have BIOSes and OSes that support it shortly afterwards.
The problem isn't the technology, the problem is starting from scratch and lasting long
Re: (Score:3, Funny)
This is why I love Slashdot: lazy Saturday, talking about video cards, not even 100 posts up yet...
BAM! Godwin!
magical future (Score:2)
How long can this "modern" society continue to support this out-dated concept of "billions of dollars" having some sort of meaning?
"Billions of dollars" is basically the excuse for all the things that we think are the causes of, for example, global warming, and the current slump in the economy.
It's also the excuse for the existence of Microsoft and Intel, and for the tons of junk they produce, most of which adds unnecessarily to the landfills every day.
No, we wouldn't have processors this fast without Intel
Re: (Score:2)
No, nothing fancy. Humans just tasted the advantage of Open. They don't give away free shit without getting anything in return, like an open graphics card that works the way they want it.
Also notice the element of changing something for the better; power.
Re: (Score:2)
Well, because corrupt governments have this instinct to attempt self-preservation in destructive ways. (Self-destructive ways, in the end, for all that.)
Re:Do we want an open source video card? (Score:5, Insightful)
There's not that much mystery about the things. Making a VGA emulator in an FPGA is no big deal. If all you implemented was text mode and mode 13H, it would probably boot Linux. Getting to a card that runs OpenGL is a big job, but not out of reach. The pipeline is well understood, and there are software implementations to look at. As you get to later versions of DirectX, it gets tougher, because Microsoft controls the documentation.
But the real problem is that you'll never get anything like the performance of current generation 3D boards with an FPGA. There aren't anywhere near enough gates. You need custom silicon.
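The reason mode 13H is singled out as easy above is that it is just a flat 320x200, byte-per-pixel buffer at physical address 0xA0000: no planes, no banking. A minimal sketch of the addressing, with the VRAM simulated as a plain array rather than the real hardware address:

```c
#include <stddef.h>
#include <stdint.h>

/* Mode 13h: 320x200, 8 bits per pixel, one linear buffer.
 * On real hardware this array would be the memory at 0xA0000;
 * here it is simulated so the sketch can run anywhere. */
#define MODE13_W 320
#define MODE13_H 200

static uint8_t vram[MODE13_W * MODE13_H];  /* stand-in for 0xA0000 */

/* Linear addressing is the whole trick: offset = y * 320 + x. */
static size_t mode13_offset(int x, int y) {
    return (size_t)y * MODE13_W + (size_t)x;
}

static void put_pixel(int x, int y, uint8_t color) {
    vram[mode13_offset(x, y)] = color;
}
```

A text-mode console plus this one addressing rule is enough for an early boot display, which is why the comment above treats it as the easy part.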
Re: (Score:3, Insightful)
Implementing OpenGL efficiently isn't just a "big job" it's essentially the entire field of computer graphics hardware.
It's understood, though. And you can do it in sections. Start with an OpenGL implementation that does tessellation, geometry transforms, and fill in software. Build something that just does the fill part. (That's 1995 technology in PC graphics.) Then add the 4x4 multiplier and do the geometry on the board (that's 1998 technology on the PC, 1985 technology for SGI.). Once all that'
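The "4x4 multiplier" geometry stage mentioned above really is little more than a matrix-vector multiply in homogeneous coordinates. A sketch of one vertex going through it (column-vector convention and row-major storage are illustrative choices, not from any particular hardware):

```c
/* One vertex through a fixed-function geometry stage: a 4x4
 * transform in homogeneous coordinates. */
typedef struct { float x, y, z, w; } Vec4;

static Vec4 transform(const float m[16], Vec4 v) {
    Vec4 r;
    r.x = m[0]  * v.x + m[1]  * v.y + m[2]  * v.z + m[3]  * v.w;
    r.y = m[4]  * v.x + m[5]  * v.y + m[6]  * v.z + m[7]  * v.w;
    r.z = m[8]  * v.x + m[9]  * v.y + m[10] * v.z + m[11] * v.w;
    r.w = m[12] * v.x + m[13] * v.y + m[14] * v.z + m[15] * v.w;
    return r;
}
```

Sixteen multiply-accumulates per vertex is why this stage mapped so naturally onto dedicated hardware in the late-90s cards being described.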
Re: (Score:3, Interesting)
If anything, a modern GPU is easier to implement than an older one. Modern APIs basically ditch the entire fixed-function pipeline in favour of an entirely programmable one. The fixed-function OpenGL 1-style pipeline is implemented entirely in the drivers. The GPU is just a very fast arithmetic engine. Some are SIMD architectures, but a lot of the newer ones are abandoning even this and just having a lot of parallel ALUs. You could probably take an FPU design from OpenCores.org, stamp 128 of them on a
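The "lots of parallel ALUs" view amounts to running one small independent program per data element. A toy stand-in for what each of those stamped-out FPUs would execute, purely illustrative:

```c
/* The programmable-pipeline view: the same tiny 'shader' applied
 * independently to every element. Each loop iteration has no
 * dependence on the others, which is exactly what makes it easy
 * to replicate across many parallel ALUs. */
static void run_shader(const float *in, float *out, int n, float k) {
    for (int i = 0; i < n; i++)
        out[i] = in[i] * k;  /* trivial per-element program */
}
```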
Re: (Score:2)
yeah, cos it's 30 years
Re: (Score:2)
American consumers have sunk billions in buying video cards on which to play games. Then, the companies that designed the components for those video cards invested in video card research.
It's not exactly the same thing.
funny thing about NDAs -- (Score:3, Interesting)
The tighter the NDA, the more you should suspect that the underlying tech is not rocket science.
So to speak.
In this case, graphics is not that hard. Fast graphics, even, is not all that hard.
The cruft is the thing that is hard. Mechanisms to manage (emulate) the cruft are about the only thing non-obvious enough to get a good patent on, and much of that, if shown to the light of day, will be seen to be covered by prior art.
A big part of the reason Intel got so excited about ray-tracing was that they were/are
Re: (Score:2)
Nitpick: Hyposexuality means you're the opposite of a nympho - you don't desire sex AT ALL.
I think "hypersexuality" is the word you were going for. :)
Re: (Score:2)
How do you think they had time to learn all of those things? Not having sex sounds about right.
Re:Do we want an open source video card? (Score:4, Interesting)
Sorry to get a bit crazy here, but imagine a world with technology like that in Ghost in the Shell. I would not go getting such implants and additions if I did not and could not have complete control and understanding over the stuff. This type of project is a small step in maintaining individual control.
Re:Do we want an open source video card? (Score:5, Insightful)
When a piece of music, or a play, enters the public domain, there are effects beneficial to the public:
These have analogs here. Having a Free video card design means that low-end video cards can become that much cheaper (and that there's more room for new entrants into the very-low-end market), and that there's a common, available base on which new and innovative work can be done.
Re: (Score:3, Insightful)
I hope this group of engineers can succeed in producing an open board that eventually provides high-end graphics capabilities.
Re: (Score:2)
Yes. OK, let's take a look at Intel Graphics. So they have free software drivers. Cool and all, but what if you wanted to write a Gallium3D (totally different style, kinda) driver for it? Well, you can't, because you do not know the hardware.
How about if you want to reprogram the GPU itself? It would be kinda nice to know what the graphics card is made of.
Re: (Score:2)
That assumes that this open graphics card is going to be anywhere near the commercial graphics chips any time soon. If you're already content with obsolete technology by going to VGA, then open source driver support isn't going to be a problem.
A milestone? (Score:4, Insightful)
Also, they can't possibly approach competing with NVidia or ATI, and I doubt anyone's going to shell out a billion dollars to build a plant to make their cards. If they're just playing around with FPGAs then this isn't really a serious "Open Graphics Card"
Re: (Score:3, Insightful)
The best answer I've read so far regarding the "why" for this was simply: because we can.
There is a reason people pay so much for other people to make computers (sw and hw). It is so they don't need to worry about it.
I'm all for the Open Whatever project. Simply "because we can". It is like climbing a mountain.
And hey, who knows what we will see on the other side once we reach the summit.
Re:A milestone? (Score:5, Informative)
The /. post gives the wrong impression about the VGA implementation - it was difficult because they wanted to implement it in an extremely simple fashion, not because VGA itself is complex.
Re:A milestone? (Score:5, Interesting)
Also, they can't possibly approach competing with NVidia or ATI
If you are running Windows on an x86 box, this may be true. Move to FreeBSD on an ARM embedded display and getting the drivers becomes dicey. Want to optimize medical imaging requiring 48-bit color rather than a typical game? Bet you will have better luck with an FPGA than an off-the-shelf card.
if you want 48-bit color... (Score:2)
You're gonna need a 16-bit D/A, and you don't do that with FPGAs. What you really need is a 48-bit RAMDAC. The rest is easy; you don't even need any GPU acceleration if it would be too difficult to work out, just use the CPU.
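For feeding existing 8-bit-per-channel content to the 16-bit range such a RAMDAC expects, the usual trick is bit replication, which maps 0 to 0 and 255 to 65535 exactly with evenly spaced steps in between. A sketch:

```c
#include <stdint.h>

/* Widen an 8-bit color channel to 16 bits by bit replication:
 * (v << 8) | v is the same as v * 257, so 0x00 -> 0x0000 and
 * 0xFF -> 0xFFFF exactly. */
static uint16_t widen8to16(uint8_t v) {
    return (uint16_t)((v << 8) | v);
}
```

Genuine 48-bit sources (e.g. 16-bit-per-channel medical imagery) would of course bypass this and drive the DAC directly.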
I have written display drivers for several ARM embedded devices. I find it pretty easy, because when you make a system like that, you can get the entire spec for the display from the display controller vendor, something you can't get from NVidia or ATI.
larrabee open? (Score:2)
huh? did I miss something?
Re: (Score:2)
VGA is reasonably well documented, although a lot of the quirks aren't. It is, however, a horrible and painful design which had a zillion rarely-used features.
Re: (Score:2)
Well, yes and no. Chunks of VGA are well documented, but it's a register spec that doesn't say what happens when you deviate from the documented register values. Over the years various programmers have stepped outside the spec and gone their own way (Doom, Microsoft, ...). Enough people have done it that unless your design does the right thing in all these architectural black holes, you're not 'compatible'.
Re:A milestone? (Score:5, Informative)
A lot of times, FPGAs are used for development. Once the design is proven, you can go to etching it into silicon. Almost nobody builds a fab for one chip; the good news is that chip fabs can make numerous different kinds of chips. There are many fabs that are willing to take any design that comes their way, as long as the money is there.
Re: (Score:2)
Both ATI and Nvidia are fabless companies. They only design chips and then send the specs to a plant in China.
Re: (Score:2)
ATI? Even now that they are part of AMD?
Can you pass me some sort of citation/link?
Re: (Score:2)
http://en.wikipedia.org/wiki/ATI_Technologies [wikipedia.org]
As for AMD, they also want to outsource [google.com] their chip production, I think they already spun off all production facilities into a new company [wikipedia.org].
Perhaps... (Score:2)
So if companies and individuals worldwide are willing to free themselves from proprietary graphics card designs so that their software will work better, then they're probably willing to
Re: (Score:2)
I think you're making an unwarranted connection between how much it would have cost to build Linux commercially and how much Linux is actually worth. Though it may have taken a billion dollars of work, if it only carves out 500 million of wealth in its lifetime then investors certain
Could be... (Score:2)
As far as risk is concerned, most companies and individuals have taken many calculated risks since there is no guarantee of profits. This is true for any venture (oth
Re: (Score:2)
Linux is communism built upon a capitalistic foundation, namely the only way to achieve real communism by choice, and therefore maybe a small step in human society, but a huge leap for mankind.
This 'virus' is now spreading to the physical, namely something you can trade. From services to goods, putting capitalism to good use, without needing a government for flawed communism; creating and freely sharing, just because we can and out of free will.
I see all sorts of tiny things without real visible impact happenin
Re: (Score:3, Informative)
They are only "non-standard" in popular belief, but are very much THE STANDARD.
It is true th
Re: (Score:3, Informative)
All you pups that don't remember VGA modelines and frazzling your monitor with the wrong XFree settings.
VGA only goes up to 640x480x16 with 256k RAM; after that anything goes. IBM lost their grip on the market when they wrong-footed themselves trying to force end users onto their MCA bus and XGA. MB/IO-card cloners started to design their own cards. VESA Local Bus was born and MCA was largely ignored.
Intel became the trend setter and (after EISA) PCI became the BUS and 3Dfx stole the gamer market from und
some kind of useful background (Score:5, Informative)
Re: (Score:2)
I approve.
Re: (Score:3, Interesting)
So does the host interface part reside in the Lattice FPGA, in 10K LUTs?
Re: (Score:3, Informative)
Yes.
The XP10 contains these parts:
- PCI
- Microcontroller that does VGA
- PROM interfaces
The S3 is mostly empty and contains these parts:
- Memory controller
- Video controller
- Room for a graphics engine
Not really worthwhile (Score:2)
unless the chips and expansion card circuit boards can be made in mass quantities to make them more affordable. You have to make them in mass quantities in order to drive the cost down.
Nobody wants to buy a $300 open-source graphics card when a closed-source graphics card costs $100 and has better graphics.
Still, this is a good idea: instead of Chinese companies stealing closed-source ideas and violating IP laws, they can make open-source graphics cards using the open-source license and be legal. I would like t
worthwhile (Score:2)
(Setting aside the idea that this should only be good enough for the undesirables in developing country X ...)
For now, I'm not thinking about a game video controller.
I'm thinking about an LCD video controller for a pocket calculator that costs less than JPY 5,000 and runs dc if I ask it to. And gforth and vi. Oh, and bash and gcc, of course.
And maybe I can plug in an SD with an English--Japanese--Spanish--Korean--etc. dictionary on it.
BIOS (Score:3, Interesting)
As a person who actually did proprietary BIOS development, I can tell you that:
1. It's possible to make BIOS boot without VGA.
2. It's usually a massive pain in the neck.
One of my projects involved making one of the popular proprietary BIOSes boot on custom x86 hardware that lacked VGA. On the development board (where I could attach and remove a PCI VGA card) all it took was setting console redirection in CMOS setup, turning the computer off, removing the VGA card and booting it again. On the production board (with no built-in graphics adapter and no PCI slots) I also had to modify the BIOS so console redirection was on by default.
Then I had to spend weeks rewriting the console redirection code to make it work properly -- I had to rely on console messages when debugging custom hardware support, and the existing implementation was way too crude to actually display all those messages. Existing implementations merely allocate a "VGA" buffer in memory, occasionally check it for changes and send the updates to the serial port using VT100 escape sequences. "Occasionally" is the key word here.
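The shadow-buffer scheme described above can be sketched as a diff between the emulated text buffer and what was last sent over serial, emitting a VT100 cursor move plus the changed character per cell. Names and buffer layout are illustrative, not from any actual BIOS:

```c
#include <stddef.h>
#include <stdio.h>

#define COLS 80
#define ROWS 25

/* Compare the emulated text buffer ('shadow') against what was
 * last transmitted ('sent'); for every changed cell, append a
 * VT100 cursor move (ESC [ row ; col H, 1-based) and the new
 * character to 'out'. Returns bytes written. */
static int redirect_diff(const char *shadow, char *sent,
                         char *out, size_t outsz) {
    int n = 0;
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++) {
            size_t i = (size_t)r * COLS + c;
            if (shadow[i] != sent[i]) {
                size_t left = outsz > (size_t)n ? outsz - (size_t)n : 0;
                n += snprintf(out + n, left, "\x1b[%d;%dH%c",
                              r + 1, c + 1, shadow[i]);
                sent[i] = shadow[i];
            }
        }
    return n;
}
```

Polling this "occasionally" from a timer is exactly what drops messages; polling on every write is what the rewrite described above had to achieve.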
Uses of OpenGraphics (Score:3, Interesting)
To all of those who keep saying this project is useless because it will never compete with NVIDIA/ATI:
Although I agree with those who cite "because we can" as a perfectly valid reason, it is not the only reason. The lack of high quality open source 3D graphics drivers has long been an issue with desktop applications of Linux/*BSD, and while NVIDIA's closed drivers do fairly well they still limit the options of the open source community. If a bug in those drivers is responsible for a crash, it's up to NVIDIA to do something about it. The open source community is prohibited from fixing it. Remember the story about how Stallman reacted when he couldn't fix the code for a printer?
Plus, who knows what optimizations might be possible if the major open source toolkit devs could sit down with the X.org guys and the OpenGraphics folk and really start to optimize the whole stack on top of an open graphics card? It wouldn't be up to the standards of NVIDIA or ATI for 3D games, but in this day and age you need decent 3D graphics for a LOT more than that! Scientific apps, Graphics applications, CAD applications... even advanced desktop widget features can take advantage of those abilities if they are present. What if ALL the open source apps that need "good but not necessarily top of the line" graphics card support could suddenly get rock solid, flexible support on an open card?
The paradigm for graphics cards has been "whoever can give the most features for the newest game wins" for a long time now. But there is another scenario - what if maturity starts becoming more important for a lot of applications? For many years, people were willing to toss out their desktop computer and replace it with one that was "faster" because usability improved. Then the hardware reached a "fast enough" point and the insane replacement pace slowed. For some specialized applications, there is no such thing as a computer that is "fast enough", but for a LOT (perhaps the grand majority) of users that point is dictated by what is needed to run their preferred software well. If the open source world can get their applications running very well atop OpenGraphics, who cares what the benchmark performance is? If the user experience is top notch for everything except the "latest and greatest games" (which usually aren't open source games, btw - most of the most advanced open source games are using variations on the Quake engines, which being open source could certainly be tuned for the new card) and that experience is better BECAUSE THE CARD IS OPEN AND OPEN SOURCE IS SUPPORTING IT WELL, it will have a market that NVIDIA and ATI can't hope to touch. Perhaps not a huge market, but niche products do well all the time.
There is one final scenario, which is the open nature of this board's design allowing virtually all motherboard manufacturers to include it as a default graphics option on their boards at very low cost. That might allow for logic that uses that card for most things and fires up the newest cards specifically for games or other "high demand" applications if someone has one installed (presumably installed because they do have a specific need for the raw power). This would mean broader GOOD support for Linux graphical capabilities across a wide span of hardware as part of the cost of the motherboard, which is a Very Good Thing.
Remember, this is not actually a graphics card! (Score:2)
This is a graphics card DEVELOPMENT PLATFORM. That implies a few things:
(1) This is a proving ground for designs that could be turned into a fast ASIC.
(2) Graphics is only one of countless things you could use this for. How about using this as a basis for cryptographic offload, or high-end audio, or wifi?
Re: (Score:3, Informative)
If you had RTFA you would have noticed that that's exactly what they have done!!
Re: (Score:2)
[Architect of this VGA implementation writing] I have tried implementing VGA text-mode emulation where the conversion from text (2 bytes per character) to graphics is done on the host CPU. This is doable for DOS apps, but as soon as the system enters protected mode, the interrupt table is blown away, and your console stops. The only solution is to do VGA text (emulation or not) in hardware, because the host (BIOS, Linux, etc.) expects the hardware to have a certain interface and do certain things. We chos
Re: (Score:2)
Oh, and one other thing: even in DOS, a lot of programs (including the DOS command shell) completely bypass the int 0x10 API for drawing text, instead opting to write directly to the text buffer in the hardware. My host-only software solution was to hang off the 18 Hz system timer interrupt and just redraw the screen every interrupt. Of course, that goes away too when you switch to 32-bit protected mode.
Re:Hey (Score:5, Informative)
Cool, yes. Useful - hardly.
If you start with a clean slate, why would you bother with VGA emulation? Could you not just go for a sane solution, such as a flat frame buffer? Every other architecture does that; why does the PC architecture have to drag along legacy modes such as CGA with a number of 4-color palettes?
A flat 8-bit RGB buffer would make a lot more sense, and I am sure Linux would boot faster on it, too.
Re:Hey (Score:5, Insightful)
Unfortunately the BIOS and boot loader will still need VGA. Maybe Linux BIOS could remove that requirement, but you can't count on that.
They seem to have implemented it in a very cool way too. Quote from a linked OSNews article:
Re: (Score:3, Interesting)
For what it's worth, nVidia cards can do this just fine and have been able to for a long time. See the "full GPU scaling" option in nvidia-settings. My HTPC's nVidia card also shows the BIOS on my HDTV at 1080p link resolution (while pretending to be VGA to the software).
Re:Hey (Score:4, Interesting)
But it probably still uses an 8x16 pixel font, which doesn't look that good on a 30" screen.
I think the idea is that the video card could pretend it's VGA, while substituting an antialiased 32x64 font in its place. Nothing earthshaking of course, but that sure would look nice.
Your text mode could look like this [omag.es]
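The trivial half of such a font substitution is just widening the glyph bitmap. A sketch expanding one 8-pixel glyph row to 32 pixels by nearest-neighbour replication (real antialiasing, as suggested above, would instead render from a larger source font):

```c
#include <stdint.h>

/* Expand one 8-pixel glyph row (one bit per pixel, MSB leftmost)
 * to 32 pixels by replicating each bit 4 times. The result packs
 * the 32 output pixels as bits of a uint32_t, MSB leftmost. */
static uint32_t expand_row_4x(uint8_t row) {
    uint32_t out = 0;
    for (int i = 0; i < 8; i++)
        if (row & (0x80 >> i))
            out |= (uint32_t)0xF << (28 - 4 * i);  /* 4 copies of the bit */
    return out;
}
```

Since the OS only ever talks to the VGA register interface, it has no way to notice that the card drew a wider (or smoother) glyph than the one it thinks is in font ROM.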
Re: (Score:2)
Why don't you get involved in the project then and tell them that.
Re: (Score:3, Informative)
Sure it does; the BIOS boot screen, settings, etc. are kind of important to be able to see sometimes. The boot loader also counts on VGA mode, too.
Now of course you could use LinuxBIOS instead, but then that adds the requirement of having a supported motherboard, and being willing to risk flashing it.
Re: (Score:2)
Yeah, you could.
But that non-x86 computer probably doesn't have a PCI slot to plug that card into, so that detail doesn't seem to be very important.
Most computers you can plug one into are going to start in text mode, and being unable to do things like changing the disk boot order in the BIOS would be quite annoying.
Re: (Score:2)
Sparc is a pretty open platform (open processor specs, if I'm not mistaken, and some open-source processors available as well), and it has PCI support.
Re:Hey (Score:4, Interesting)
Ok, and how many people are going to run a desktop on it? It's server hardware.
Again, you seem to be missing my point. Yes, Linux technically doesn't need the BIOS. Yes, there exist other architectures besides x86.
But, a video card is a product for desktops, and the vast majority of desktops are x86. The vast majority of those start booting in text mode.
Pretty much all other architectures are unimportant in comparison, because they're used in embedded hardware, or are technically outdated. If anybody is going to buy this thing, I doubt they're going to put it into a modern Sun server.
It's already a project that's going to find it hard to get wide adoption; why would you make it even harder for it to find a use, by making it incompatible with by far the most common hardware it could be plugged into?
Re: (Score:2, Interesting)
I've used Sparc desktops [sun.com] in the past. I even used one as my main home machine for a while. You could even get Sparc laptops.
In their time they beat the Intel option, imo, and they are still in use in some places.
Re: (Score:2)
Ok, and how many people are going to run a desktop on it? It's server hardware.
That is really a matter of implementation, you can build a SPARC chip without expensive RAS stuff. See Itanium vs Xeon vs consumer Intel procs.
Pretty much all other architectures are unimportant in comparison, because they're used in embedded hardware, or are technically outdated. If anybody is going to buy this thing, I doubt they're going to put it into a modern Sun server.
SPARC isn't outdated, and I thought the other guy was talking about building new open desktops, not using existing server equipment. With all this talk of ARM being the next big thing for netbooks, I'm surprised anyone would take this stance against non-x86 architectures. What would you lose? Proprietary, binary x86 blobs? Hahaha, typically not a problem for the
Re:Hey (Score:4, Insightful)
You fail to take into account how fast things can change in the desktop arena. I say we have had enough of BIOS, VGA, and text mode as such. For all it is worth, do it like Macs do: start up the minimalistic OFI/EFI with the video card in graphics mode, and boot up the OS from the disk blocks as fast as you can. If anybody wants to mess with their system before the OS loads up, they should press that Alt+Option+O+F or whatever that was, and type firmware commands into the console. BIOS accomplishes neither task well: it gives experts stupid interfaces, when they could be using a command line instead, and novices do not even know what they are doing in the BIOS.
And no stupid 4-bit color Dell/Lenovo/HP/Asus/Acer logos with stupid BIOS text and the even more stupid BIOS itself.
There is no need for two operating systems on one computer for the majority of us, BIOS being one. And it will save us 10 seconds of idle time at startup. Degrade the common subset of hardware interfaces so that the only thing the bootstrap procedure needs to do is get to the boot block of whatever device contains the further loading code. No VGA BIOS and BIOS interface is needed for that in their entirety. Just a way to read the boot sector from a device. That does not need a vendor logo on the screen or the multitude of settings the BIOS provides, before these are superseded with OS drivers anyway.
Re: (Score:2)
> Again, you seem to be missing my point. ...
> The vast majority of those start booting in text mode.
That is not even a point, that is just a fact. The point is whether you want your PC to start booting in text mode. I can see no reason why, and I certainly see no reason why you would want to spend silicon on it.
The BIOS is a terrible bunch of legacy code that should be eliminated rather sooner than later. Linux switches into a graphics mode as soon as possible, and from there on it is pretty independ
Re: (Score:2)
Well, sure.
Re: (Score:2)
That was exactly my point, I've used Sun Sparc-based workstations, they have the same RAM and PCI-busses, etc.
Re: (Score:2)
Just forgot to mention, there are already OpenHardware sparc-processors out there.
"LEON is an open source 32-bit SPARC-like CPU created by the European Space Agency. It's the standard CPU for the European Space Industry."
But the clock frequency is measured in MHz, not GHz at this point.
Re: (Score:2)
How about if you had a text interface with some underlining, strikethrough, bold, italic and also colors? Also a help command that was like manpages, formatting content so you can view it better. All still with the command line, which in my opinion is quite a streamlined interface for administrative tasks. You will not forget commands if you have good documentation. I don't think it would take much space, if properly compressed.
How about, yes exactly, if you could run scripts and also store these to ROM or e
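For what it's worth, the underline/bold/color attributes asked for above already exist as ANSI SGR escape sequences, so a firmware console could support them with a plain serial terminal. A sketch assembling one such sequence (the particular parameter choices are illustrative):

```c
#include <stddef.h>
#include <stdio.h>

/* Build an ANSI SGR sequence: ESC [ p1 ; p2 ; p3 m.
 * 1 = bold on, 22 = bold off; 4 = underline on, 24 = off;
 * 30..37 = standard foreground colors. Returns bytes written. */
static int sgr(char *buf, size_t n, int bold, int underline, int fg_color) {
    return snprintf(buf, n, "\x1b[%d;%d;%dm",
                    bold ? 1 : 22, underline ? 4 : 24, 30 + fg_color);
}
```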
Re: (Score:3, Informative)
Copy a Linux kernel to /dev/fd0 and it will boot without a boot loader
Will it? How does the CPU know how to read the image off the floppy?
Re: (Score:2)
The ROM BIOS knows how to bootstrap directly from floppy. That's how MS-DOS used to work and the code is still there.
It is perfectly possible to replace a conventional BIOS with something like OpenBIOS [openfirmware.info] (a Free re-implementation of the standard firmware in SPARC and PowerPC workstations) or Coreboot [coreboot.org] (formerly LinuxBIOS).
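The bootstrap step being discussed is genuinely small: the firmware reads the 512 bytes of sector 0 into memory and checks the 0x55AA boot signature before jumping to it. A sketch of the check, with an in-memory sector standing in for the disk read:

```c
#include <stdint.h>

/* A legacy PC boot sector is 512 bytes and must end with the
 * two-byte signature 0x55 0xAA (bytes 510 and 511) for the
 * firmware to consider it bootable. */
static int has_boot_signature(const uint8_t sector[512]) {
    return sector[510] == 0x55 && sector[511] == 0xAA;
}
```

Everything beyond "read sector, check signature, jump" is what the comments above argue could be stripped out of the firmware.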
One of my previous projects was a storage appliance, running Linux, which had a custom motherboard with a Pentium III 1000MHz and 2 250GB SATA disks.
The ROM (c.f. "BIOS" on a PeeCee) con
Re: (Score:2)
So, at boot, the firmware initializes basic hardware so that it can do basic diagnostics and start the boot process of the OS
So wouldn't that be some sort of boot loader?
I only know of one common machine where no software was involved in the getting the bootstrap off the disk ;-)
Re: (Score:3, Informative)
Writing a bios specifically to boot straight into Linux i
Re: (Score:2, Informative)
We are proposing the same thing. It's just that you use the term BIOS instead of firmware. BIOS is an x86-centric word.
Booting Linux without a BIOS has already been done; check this out: http://en.wikipedia.org/wiki/Splashtop [wikipedia.org]
Some Asus MBs have this feature. My M3A78-EM has it; it can boot a small Linux distro in about 5 seconds.
Re:Hey (Score:5, Insightful)
Slashdot, RTFA, blah blah blah.
If you go to the Wiki, and read the link in the top article, there is a link to OS News. If you follow that and read down in the comments, you'll find this post [osnews.com] by the architect of the VGA emulation.
Apparently it really is emulation. The MCU they use as a PCI interface has a mode that generates the raw pixels when given VGA commands. It handles the VGA interface. The graphics processor just receives (from its perspective) pixmaps that are constantly generated by the MCU in VGA mode.
The guy says that VGA on their card is actually resolution independent (since the MCU generates what is needed) and could actually be up-sampled to show clearer fonts without the OS having any idea it was going on.
He says it's not the cleanest way of doing things (from a methodology standpoint), but it has the least impact on the design of the hardware (compared to a "real" VGA interface).
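What the MCU-side text emulation has to do per character cell can be sketched as a glyph-bitmap expansion into foreground/background pixels; the font table, cell size, and colors here are made up for illustration:

```c
#include <stdint.h>

/* Render one 8x16 text-mode character cell: for each bit of each
 * glyph row (MSB = leftmost pixel), emit the foreground color if
 * the bit is set, else the background color. An MCU doing VGA
 * emulation runs this over all 80x25 cells to produce the pixmap
 * the graphics engine actually displays. */
static void render_cell(const uint8_t glyph[16], uint8_t fg, uint8_t bg,
                        uint8_t out[16][8]) {
    for (int row = 0; row < 16; row++)
        for (int col = 0; col < 8; col++)
            out[row][col] = (glyph[row] & (0x80 >> col)) ? fg : bg;
}
```

Because the glyph-to-pixel step happens on the card side, the resolution independence mentioned above falls out for free: nothing stops the MCU from rendering into a larger cell than the OS believes exists.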
Miguel de Icaza recommends the Linux Hater blog (Score:2)
"As usual, the Linux Hater Blog has some great commentary. Some of his feedback on KDE 4.0 applies to our own decision making. Worth a read [tirania.org] " Miguel de Icaza on 14 Jul 2008
This is the kind of commentator he recommends people read:
"I was getting a little worried that I wouldn't have something appropriate to close of K-pride week with, but
You took the optimists' point of view. (Score:2)
I don't take the optimists' point of view. I suspect shills.