# Sharing CPU over LAN



## OUHAC

I was wondering if there is any software that would let me share the CPU power (or other resources, like RAM) of other computers with one PC via LAN or another connection. If you know of something that could help, please let me know.

I know something like this already exists. I've seen universities asking people to share their CPU power so they can use it for the development of some project. But apparently this technology, 'distributed computing' (as they like to call it), is not yet on the market for ordinary users.

Business people, develop it - you'd probably get rich :laugh:


----------



## aio

Yes - PC World bargain basement. Ask their experts for advice.


----------



## johnwill

Why don't you describe exactly what you want to accomplish? While there are a variety of schemes for clustered computers and the like, many are very task-specific.


----------



## Geco

I think I know what you mean. I used to be after the same type of solution myself, until I took a second to think about it.

Personally, I was looking for a way to harness CPU and RAM power over LAN - preferably without HDD support and even without an OS. So what I was after was a driver of some sort that would allow multiprocessing capabilities over a local area network in a reliable and simple manner.

What I had in mind was something like this: if a processor gets a burst of interrupts or demands, it shares the processes or requests over the LAN and hands the job to another unused shared resource. Effectively it would be a software router that bundles the requested operations into one encapsulated packet with the processing demands, while sending another encapsulated (VPN-style) packet with the data to be distributed to 'remote RAM'. Something like an interrupt and system-bus sniffer would be used to distribute the packages.
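In pure software terms, the dispatch half of Geco's "software router" idea boils down to: run the job locally while there is headroom, otherwise hand it to the least-loaded shared node. Here is a minimal sketch of just that decision; the node names, load figures, and the 80% threshold are all invented for illustration.

```python
# Hypothetical sketch of the "software router" dispatch decision:
# keep work local while the CPU has headroom, otherwise hand the job
# to whichever shared node is currently least busy.

LOCAL_THRESHOLD = 0.80  # assumed cutoff: offload once local load passes 80%

# Invented example loads (fraction of CPU busy) for a local box and two peers.
nodes = {"local": 0.95, "node-a": 0.40, "node-b": 0.10}

def dispatch(job):
    """Return the name of the node that should run this job."""
    if nodes["local"] < LOCAL_THRESHOLD:
        return "local"
    remote = {name: load for name, load in nodes.items() if name != "local"}
    return min(remote, key=remote.get)  # pick the least-loaded remote node

print(dispatch("render frame 1"))  # node-b is the idlest, so it gets the job
```

Of course, this ignores everything that makes the real problem hard - shipping the job's code and memory across the wire, and the latency discussed further down the thread.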

The idea might be completely far out. On one hand, considering it would most likely need gigabit Ethernet support for just some basic functions, I can imagine many stability issues and infrastructure demands. Yet the BIOS does support LAN installation of an OS, and if I remember correctly Novell DOS (with a boot chip on the board) also had the option of LAN booting, so the idea stands on not-too-far-fetched ground. Also, with BIOS support for USB devices, the driver to share resources could be easily loaded.

I am not remotely educated enough to envision all the problems linked to such an idea, but using multiple processors and RAM like a single machine is most likely the dream of many people, myself included. The applications of such technology would probably be versatile. But to have the technology available at reasonable prices, it would probably have to appeal primarily to serious gamers.

With that in mind, picture this: while our local computer acts as the OS and distribution center, a remote computer runs the game with its resources. Furthermore, two more remote computers could act as graphics-rendering locations, while a third handles surround-sound processing.

But like I said, this idea is most likely far too complicated to be useful in real life. The issue of "real time" processing is questionable due to limitations in LAN speeds (a typical bus and today's processors work so fast that we could probably only think about feeding 286-class computers this way), and even if we used an array of multiple network cards, it still wouldn't be sufficient.

Optics could prove useful, but that again would be problematic, as it would demand creating new standards - and if you want to live without too many headaches from fan noise, it would mean drilling even more holes in your house just for the cables.

All in all, even though this is every computer freak's vision of heaven, it most likely will never happen. Too bad, though. An Opteron 885 dual-core processor costs $2000+, while an Athlon64 DC 4800+ costs just over $600. Having four Athlon64s sounds like a better idea - or at least owning multiple cheaper processor models - yet it is a whole lot worse in the sense of LAN limitations.

So to be honest, you will probably not find any useful resources on the subject. And even if you did, they would have to be for a single-core, single-bus (higher-quality multiprocessor boards use separate buses to link individual RAM slots to individual processors), 32-bit processor - which we are all now abandoning.

The thing you mentioned, though, was not shared CPU power of the kind you seek. You probably mean programs like SETI and similar, where you have a stand-alone application on your local computer (okay, a remote computer from the SETI people's point of view) which downloads data, processes it, and then sends the results back to their rightful owners. But don't get confused by such solutions. They are in fact not really a shared CPU resource, just a process-carrying node that does not work in real time.

Writing an API to use such resources is very simple. Just make a database which you fill with your unprocessed data, and another one for the results (you don't actually even need two databases - two tables would do). You create a user application, which you allow to connect to a specific IP and port to download and upload data; you add a database module to your API and create the link. The rest is up to the computer as a whole, not just some of its parts.
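The batch scheme Geco describes - a table of unprocessed work units, a table of results, and a client that crunches units offline - can be sketched in a few lines. Everything here is illustrative: the schema, the toy "processing" step, and the use of a local SQLite database standing in for what would really be a server reached over the network.

```python
import sqlite3

# SETI-style batch processing in miniature: one table of pending work
# units, one table of results, and a client loop that processes each
# unit and files the answer. Schema and payloads are invented examples.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pending (id INTEGER PRIMARY KEY, payload INTEGER)")
db.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, answer INTEGER)")
db.executemany("INSERT INTO pending (payload) VALUES (?)", [(3,), (5,), (8,)])

def process(payload):
    # Stand-in for the real number crunching (signal analysis, etc.)
    return payload * payload

# The client loop: fetch the units, process each, send the result back.
work_units = db.execute("SELECT id, payload FROM pending").fetchall()
for unit_id, payload in work_units:
    db.execute("INSERT INTO results VALUES (?, ?)", (unit_id, process(payload)))

print(db.execute("SELECT answer FROM results ORDER BY id").fetchall())
# -> [(9,), (25,), (64,)]
```

Note what this is *not*: nothing here runs in real time, and no node ever sees another node's CPU - exactly Geco's point about why this differs from true shared processing.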

There is, however, a clustered-computer solution that uses optics to share resources. Yet you most likely won't like the price tag attached to the name, given your likely financial limitations, heh.

Bottom line: buy a Cray.

Just one more thought from me, and then I'll finish: "Every attempt to find a workaround for costly high-performance computing power will always result in a much more expensive system to create and maintain - if not in the physical sense, then at least in power consumption over time."

Although, if you're using a Linux OS, it can be done. But really... how many typical applications found around the world actually run on Linux? I was a big fan for a long time, until I saw how wrong it all became. Unfortunately, even though I don't exactly agree with many Windows philosophies, I have to admit that to this day it is the best possible OS one can use (hehe, Linux fans, don't hate me for this - I can explain the logic in depth, but as this post shows, I talk too much as it is). Yet whatever you end up with, you won't be running it on XP - of that I am most sure.

I think you'll find sense in what I wrote. Go for the 4-way Opteron workstation. It will serve you well.


----------



## mickliddy

I just found this, and even though I'm not exactly sure what the initial poster was trying to do, I thought it might be relevant to what I'd like to find out about. I apologise for grave-digging, but I'm not quite sure I understand the above post correctly either.
I'd like to use the two computers I have sitting here to share graphics and processing power to run the same program - e.g. have a game run that neither PC meets the hardware requirements for, but combined they can run fine. My LAN can transfer at 54 Mb/s... is that megabits or megabytes? Would that not be sufficient to achieve the task? What sort of transfer rate would we be looking at? Even if you can't share resources between the computers for the same task, would it be possible to use them to run different threads of the application?
Thanks


----------



## johnwill

Forget about it. Unless the graphics cards are in the same computer and connected by a special bus, there is no way to share the power. The traffic between graphics processors is measured in gigabytes per second for decent performance; you are several orders of magnitude away with any network connection, and even farther with a wireless connection.
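The gap is easy to put in numbers (and it answers mickliddy's question too: Wi-Fi rates like 54 Mb/s are mega*bits*). The bus figure below is an assumed, conservatively modest 4 GB/s; real inter-GPU links are faster still, which only widens the gap.

```python
# Back-of-the-envelope arithmetic behind johnwill's point. The link
# rates are nominal, illustrative figures; real-world throughput of a
# 54 Mb/s wireless link is lower still.

gpu_bus_bytes_per_s = 4e9              # assumed ~4 GB/s local GPU bus
lan_bits_per_s = 54e6                  # mickliddy's 54 Mb/s link (megaBITS)
lan_bytes_per_s = lan_bits_per_s / 8   # = 6.75 MB/s of raw payload, at best

gap = gpu_bus_bytes_per_s / lan_bytes_per_s
print(f"LAN carries {lan_bytes_per_s/1e6:.2f} MB/s; the bus needs ~{gap:.0f}x more")
# The shortfall is roughly 600x - i.e. several orders of magnitude, as stated.
```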

This just isn't practical. Do a search on "clustered computing" if you want to share processing power.


----------



## mickliddy

Thanks.


----------



## JoHNiTyYy

Just think for a sec... you want to play some pretty game and you need a bit more processor resource... but even if you succeed in making that "processing over LAN" work, you still have a 1 ms delay, which I think will crash the application immediately, because the job handed to the remote processor will arrive back 1 ms late - and in that time the main processor could have done it by itself. And that's when the computers are in the same room. Imagine the other host is 100 miles away? That makes it at least a 20 ms delay (based on speedtest.net). So that will not work. I think it will only work for operations like brute force... things that take a long time, where the main processor takes half the jobs and the other processor takes the other half - but putting it all back together would take many gigabytes of RAM. :laugh::wave:
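JoHNiTyYy's argument reduces to a break-even rule: shipping a job out only pays off when the compute time saved exceeds the network round trip (ignoring, for simplicity, the time to transfer the job's data, which only makes offloading worse). A minimal sketch, with all timings as illustrative assumptions rather than measurements:

```python
# Break-even check for offloading a job over a network: it is only worth
# it if the job takes longer to compute than the round trip to the remote
# machine. Data-transfer time is deliberately ignored here; including it
# would make offloading even less attractive.

def offload_pays_off(job_ms, round_trip_ms):
    """True if shipping the job out is faster than doing it locally."""
    return job_ms > round_trip_ms

# A 0.5 ms game task vs. a 1 ms same-room round trip: keep it local.
print(offload_pays_off(0.5, 1.0))    # False
# A 60-second brute-force chunk vs. a 20 ms WAN round trip: worth shipping.
print(offload_pays_off(60_000, 20))  # True
```

This is exactly why the thread keeps splitting into two cases: per-frame game work loses to latency every time, while long-running batch chunks (SETI-style) barely notice it.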


----------



## johnwill

This is why projects like SETI and medical research are run using distributed computing across thousands of computers. They can actually separate the task into many independent chunks and dole them out. A game is a different animal.


----------



## ian_heath

I totally agree. I can see the obvious benefits for SOME applications (such as decoding/encoding huge files, complex 3D model building, etc.), but for gaming it would never work. Gaming relies so heavily on low latency that the overhead of managing the task outsourcing would likely result in poorer overall performance.
Besides, your Wi-Fi would already be getting "flogged", assuming you're connected to the web over the same connection.


----------



## brian1596

I heard they linked/networked computers in order to speed up the graphics calculations for Star Wars and Jurassic Park. How? I think they did the work and sent it off to be calculated. I am not sure.


----------

