bugz000.co.uk


26/09/2025

Gemini


PREFACE: i know gemini is not, and never will be, "early web", and i accept that what i want is something gemini will likely never provide. i am also not an expert, just a hobbyist sat in his bedroom sharing his hobby with others. i'm just a guy, joe bob down the road, but hear me out.

i was coding my own website. i've been home hosting for the past 20+ years and my website has gone through a variety of complete recodes. i don't use any fancy stuff, no dev environments or package managers, i code in notepad++ (a practice apparently considered "oldschool", or some would say "stupid", by today's standards? i digress). i used to use php a lot, but as my desires for styling grew bigger and i wanted more interaction, js became the solution...

i was working with javascript and found the need to geolocate the viewer: i had a 3d globe that showed all network traffic in and out of my network as great arcs around the planet, much akin to icbms, and i wanted to show the viewer's own traffic in a different color. i found various geolocation apis, but the free tiers always had limits (of course), so i made a workaround: using javascript, i would simply have the user's browser contact one of these free apis and report the result back to my server. i found some differences in the results, however, so i ended up having users poll around 30 different geolocation apis, reporting them all back to my server, and i would average the coordinates out...

it was at this moment i stopped and realised how messed up the internet is. i can, without any prompt or question, just send a user's browser to another service, have it fetch any information, and send it back to my server. my practices are benign, but nobody is questioning what is possible should intentions be foul?
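the whole trick was only a few lines. a rough sketch of the idea (the api urls here are made up, and real services each return their own json shape, so treat the field names as assumptions):

```javascript
// hypothetical list of free geolocation endpoints; real services differ
const GEO_APIS = [
  "https://geo-api-one.example/json",
  "https://geo-api-two.example/json",
  "https://geo-api-three.example/json",
];

// average a set of {lat, lon} results, ignoring any failed lookups
function averageCoords(results) {
  const ok = results.filter(r => r && isFinite(r.lat) && isFinite(r.lon));
  if (ok.length === 0) return null;
  const sum = ok.reduce((a, r) => ({ lat: a.lat + r.lat, lon: a.lon + r.lon }),
                        { lat: 0, lon: 0 });
  return { lat: sum.lat / ok.length, lon: sum.lon / ok.length };
}

// in the viewer's browser: hit every api, average whatever answered,
// then the page would report the result back to the home server
async function locateViewer() {
  const results = await Promise.all(
    GEO_APIS.map(url => fetch(url).then(r => r.json()).catch(() => null)));
  return averageCoords(results);
}
```

one caveat i'll admit: naively averaging latitudes and longitudes goes wrong for results that straddle the ±180° meridian, but for a hobby globe it's close enough.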
so, finding this problem, i stumbled on gemini and a couple of other older projects. i tried gemini, but it is "too" restricting. ultimately i miss the internet of the early 2000s, where your website was your own style, it was your world, but it was static: no analytics, no cookies, no tracking, no scripts, no ads, no telemetry. just content, page, done.

i've been toying with the possibility of making my own protocol, perhaps a fork of gemini. the largest downfall of gemini is inline media, and i understand that goes against gemini's own philosophy. some (one?) gemini browser does resolve inline media links on the frontpage too, which is cool, but not a real solution. i think there's a large number of people whose desires sit in the middle between "modern web" and "gemini".

for anyone else feeling the same, i found some resolve in using the plugin "noscript", behaviour i believe should be default in browsers, much like how flash and java used to prompt before you could run them. noscript disables all scripting on pages, and you can enable it for the pages you NEED which are broken otherwise... it's a band-aid on the problem, but a quite effective one, however nuclear the solution may be.

i understand this is not a request or a suggestion, and the proposed "shortfalls" and philosophy conflict with gemini, but the reason i post here is that i hazard a guess there are many more in this demographic with the same frustration with the modern internet. it's just become a giant billboard generator, with actual, valid, trusted, valuable information sinking far below "revenue generator" sites the likes of buzzfeed. i mean, heck, just trying to find a tutorial which isn't copy-pasted across 80,000 "tech news" websites, progressively typoed, rewritten and outdated, is impossible. i find myself referring back to my old books rather than webpages these days, it's so tainted. this is how i know we are going backwards...
should anyone have a large repository of old sites pre-2006 available to download, i would love a copy. i have mirrored a large amount of old sites from an ISP that accidentally exposed a very old webhost server from the days before they were an ISP, which i was very fortunate to stumble upon... and i have a gargantuan ebook collection... so if the internet cannot be "fixed", the very least i can do is download enough archived pages to keep them locally! i do plan on making an online search engine for the sites i mirrored, so, much like google, you can search within these old webpages and ebooks for actual, valid information... so much is being lost for the sake of generating revenue, it hurts.
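for the curious, the core of such a search engine is just an inverted index: word → which pages contain it. a toy sketch of the shape of it (not the real thing, just the idea):

```javascript
// toy inverted index over mirrored pages: word -> set of page ids
function buildIndex(pages) {            // pages: { id: "plain text", ... }
  const index = new Map();
  for (const [id, text] of Object.entries(pages)) {
    for (const word of text.toLowerCase().split(/[^a-z0-9]+/)) {
      if (!word) continue;
      if (!index.has(word)) index.set(word, new Set());
      index.get(word).add(id);
    }
  }
  return index;
}

// AND-search: return only the pages containing every query word
function search(index, query) {
  const words = query.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return [];
  let hits = null;
  for (const w of words) {
    const pages = index.get(w) || new Set();
    hits = hits === null ? new Set(pages)
                         : new Set([...hits].filter(id => pages.has(id)));
  }
  return [...hits];
}
```

build the index once over the mirror, serve queries from memory; no tracking, no ads, just lookups.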

12/09/2025

Safety_Syndrome


Safety syndrome: a term i coined to describe the effect of marketing a policy as "for safety" when it actually has financial incentives. we see this everywhere: car licences, motorcycle helmets, seatbelts, warnings on tobacco products (as if any smoker smokes for their good health), and of course the most egregious modern example, the online safety act.

take motorcycle helmets. a motorcyclist not wearing a helmet is only a risk to himself; wearing or not wearing a helmet exclusively varies his own chance of survival in any given incident. why do they mandate this? you honestly think the government cares THAT much for others' safety? they'll willingly send their people to war at a moment's notice. the same can be said for all the examples: in a land of individual liberty it should be a choice, and the only policies in place should be the ones that influence others' safety, not your own.

but when you follow the money, it's just another point to potentially invalidate an insurance claim, another point to deny health insurance. now we can call into question the age of the helmet, the integrity and repair of the helmet, where the helmet was purchased, and so on; it all factors into whether coverage exists, despite the cost being covered by payments, which is exactly how an insurance company should work. so it's not about safety, is it. it's about companies not having to pay out as much in the case of an incident, by molesting someone's individual liberty to do what the fuck they want.

so helmets are mandatory. that means every motorcycle in a given country now has to have at least one helmet in good repair, that's a helmet purchase every 4-6 years or so, that's tickets if someone test rides their bike 200 yards and back and gets caught. it's a flow of income for an entire logistical chain. it has nothing to do with protecting people's lives, it's all about the money, and i have no interest in protecting corporate money.

but because these are marketed as safety, removing them is nearly impossible. "so you WANT people to die???", they say. it becomes an easy retort for any argument leaning toward individual liberty, yet piece by piece, chip by chip, liberty is being molested. the reality is, government and policy are meant to SERVE the people, not REGULATE the people. the retort is totally moot; it's not about safety, because it was never about safety to begin with. that was just the prybar used to get the policy installed in the first place. in the interest of individual liberty, the policies should be removed.

29/08/2025

FOSS_Syndrome


Foss syndrome: a term i coined to describe the saltiness you will find in FOSS devs.

i get it. you put in the work, it was an immense amount of work, and you supported people for many years; it's almost a full time job, and it may not be paying out the millions you expected. this is because (being a FOSS dev myself) people don't pay for software. you think we'll all be cool and donate and help each other out; it just doesn't work that way unless you are incredibly fortunate. but you also despise paying for software yourself, and this is the problem: you're marketing to a demographic of people who are like-minded. it's my belief all software should be free, because it's infinitely replicable: ctrl C, ctrl V, that's another $50. it can't be good for the economy, and i believe this exact realisation is what contributed to the dotcom crash.

so as time goes on, the dev gets more burned out, more salty. they believe each individual user is taking something from them; the dev feels owed something. usability and "customer service" fall, and any form of collaboration becomes impossible due to the pride the dev has: he made it this far without contributions, why start now? it always falls back to the same argument for feature requests and bug reports, "it's FREE, you're welcome to contribute", but that contribution rarely gets accepted. it's the trap so many FOSS projects fall into, which is ironically compounded by an active dev who jumps on all bugs and issues immediately.

if you are a FOSS dev, my simple advice is: chill. it's a hobby. keep doing what you LOVE and the money will come in the end. go take a vacation, the updates can wait another week, maybe then someone else will contribute. enlist some staff, make your job easier. remember, before you started your project, no such thing existed; it's why you made the damn thing in the first place.

from foss dev to foss dev: i appreciate your work, i see you, i see your work, it's awesome, but you gotta chill.

29/08/2025

Linux_Syndrome


Linux syndrome: a term i coined to describe the effect where the people with the intelligence and skills to make a service are often socially stunted, causing their UX, and their general sense of what is intuitive, to be significantly skewed. it's no slight on them, they are undeniably geniuses, but we can't be experts at everything, and UX with a sense of wider intuition is definitely a skillset in its own right. it is absolutely observable in linux circles, and in various other fields.

it's perhaps most visible in my post about e-security. security is incredibly complex, and yet it's almost impossible for the layman to employ actual security because UX is such a low priority in comparison. in fact many would say it's outright impossible: it's a complex field, so it's going to be complex to use. but to that i say: NGINX PROXY MANAGER. i've spent way too many hours of my life bashing my face against SSL certs, all for me to make one change and the cert becomes invalid again, and it's another 12 hours of googling and mashing stuff into the terminal. it's horrible. then i found nginx proxy manager: you simply tick a checkbox to enable SSL, agree to some TOS, and IT JUST WORKS.

if anything goes wrong with the cert, you just click the button to request a new cert and within a couple of minutes it's working again. dead easy, dead simple, totally wrapped. now my website uses SSL, and i can't break it with my negligence anymore. maybe it comes from a place where the creator truly believes it's easier than it is, who knows, but though my gripes mostly surround cybersec, if linux worked on its usability juuuuust a little bit more, they could take the domestic market by storm.

it's funny how microsoft spent literally tens of billions of dollars in R&D, over literal decades, across billions of devices in millions of use cases and settings across all age ranges, and compiled this usability into a single guideline set, which is just there, free, practically public domain. you can just read it and use it, it's free to the world, it's been around for 30 years, and only in the last 5-10 years has linux STARTED to adopt it... part of me thinks the hate for microsoft fuelled that delay, and they simply independently came to the same conclusions, albeit 20-30 years after everyone else. who knows.

my point is: don't discount usability and UX. many devs consider it donkey work, they think they're above it or something, but it is critical. i was told once, software that is too configurable will require the customer to program it, rendering it useless. there's a happy balance, and i think linux syndrome describes erring on the side of "engineer made the UI".

29/08/2025

Priggard


A prig: someone who has an overzealous approach to topics of form and propriety. the official word "priggish" would refer to someone who is behaving this way. i coined my own term, "priggardistic", which itself goes against form and propriety; it's a handy insult you can throw at people, and it won't come up on google searches. they may find their way to "prig", however, which carries the same weight. so: priggard, priggish, priggardistic, priggarded and prig. thus we have the world of doublespeak: the language in english which is not defined in the dictionary, but follows enough grammatical rules to have an implied definition. now fuck off, ya priggard.

28/08/2025

VOIP


i've been looking for an open source/FOSS alternative to discord; turns out teamspeak 6 has just released their server files. but it's just curious... the dawn of the internet is up for debate, everyone has varying definitions (is fax internet? and so on). there are certainly a few widely accepted starting points, but really it was more of a development than a cardinal point. one thing is for sure though: it surged in popularity with the introduction of dialup modems, which literally used phonelines... and yet, 30 years on, we still haven't perfected online voice comms. what on earth is going on? lmfao

27/08/2025

Gaming Stagnation


remember gaming in 2006? if you don't, it was awesome. ironically, a reduced selection of games and platforms forced the community together; you knew everyone played OSRS, club penguin, any of the classics really. it was exciting, it was new. people weren't just punching out better graphics, they were innovating totally new concepts. it was a golden age of gaming, though not THE golden age... this happens every 20 years or so, where computer gaming will find a groove and surge.

but honestly i don't know what's coming next; i feel the gaming world has totally stagnated. so many games these days, so many platforms, so many options. if you find someone who's a gamer, you bet your ass they likely never play the same game you play, and even if they do, they hop between games every few days. i knew one guy who hopped literally every day; he'd play one game for about 4 hours and move on. i'm talking anything here: MMORPGs, RTS, FPS... like, my guy... how are you perpetually below lvl 12 in every game, just endlessly grinding starter quests? i never understood it. finding someone with dedication to only a few games is so rare, and pretty unique to people in my age range oddly enough, but the issue still remains.

i believe this constant surge of new releases, sequels, revamps, "GAMENAME RELOADED!", and all the other marketing bullshit they put out these days does nothing but harm their own interests, by fragmenting an already severely fragmented community even further. you should see the state of my notifications come christmas or new year: 500 games all throwing new year "celebrations" with "holiday content", trying to get me to spend my family holiday playing a fucking video game... just look at what cataclysm did to WoW, look what happened to call of duty.

i feel there's definitely a gaming bubble at the moment and it's in the process of popping, but hey, maybe we'll see actual innovation from this in the end... who knows...

23/08/2025

E[ase_of]-Security


i was messing around looking at authelia, a simple thing which is basically nginx proxy manager but with a super powerful password/authentication utility on the front; it demands SSL. i saw a discussion between the dev and a frustrated user where the dev basically said "i hear you, and uh, no", despite insisting he listens to all feedback. he did however give a long explanation as to why he said no, which is fair, until you saw his closing statement: he didn't want to open vectors for making mistakes. it immediately brought me back to my battles handling SSL certs and self hosted VPNs and other flavors of authentication hell.

my message is simple: security must be easy. it MUST be simple, it MUST be almost brainless to implement; this way no mistakes CAN happen. the complexity of modern day security is so horrendous it is literally becoming a vector of attack itself. notice this page: my website has no form of authentication at all. at some point either security is made easier, or thrown out entirely. honestly i think the whole user/pass model is outdated anyway.

i find myself drawn to TBL's newer development, "pods", but they seemingly flopped (too much money in selling other people's information). i don't exactly know how pods work, but in my mind this would be my ideal solution: basically it's an encrypted datastore, standardised. you can host your own or pay a host, and instead of companies having their own databases, they basically just get a unique hash for each data field you control. so if you sign up for 30 websites, each has a totally different set of hashes, which refer to the same data, but they don't know what that data is, only that it's a valid field. sharing those hashes achieves nothing, and you can validate and invalidate hashes as you wish; the pod stores the verification that the pod is legit and the hash is legit. make sense? maybe, maybe not. i hope something like it catches on. either way, security MUST be easy.

22/08/2025

This_Page


On a lighter note: the design policy behind this page is to bring a taste of old internet back, to demonstrate you don't need gigabytes of various frameworks, sql, rest, react, javascript, or whatever else to make a nice page that functions well. as such this page, though generated with php, is pure flat html, just text; even if i include images, they will be links. this is paired with the eventuality that i will also be hosting a version of this page on BBS!

i've been hosting for many, many years. i remember a time before the internet, i remember pre-corporate internet, i remember corporations taking interest, and the mess we have now. but i'm seeing a resurgence of old internet spirit with the clippy movement; let's hope it goes somewhere. i don't WANT to make innernet, but it is surely looking like an inevitability. the thing people forget is that the internet is not "BROWSER", it's simply a communication channel; anything can be sent over it. so the tighter they grip browsers and apps, the more likely it is an alternative will appear. this has already happened with the likes of gemini; see the Gemini entry above (26/09/2025) for a prior post of mine on exactly this.

10/08/2025

Bias


15,000. that's it; that's all it takes to completely consume your comprehension. if you met one new person every two days from the moment you were born, you'd still die without meeting that many. that's more humans than you will ever meet in your lifetime, more than your brain can physically comprehend. so when 15,000 people are all saying the same thing, it's easy to fall into the trap of considering it to be what "everyone" is saying; your brain treats it as global truth. yet with the prevalence of online communities, 15,000 members isn't anything crazy. it's a decent number, sure, but not unobtainable by any means, especially if enough people are looking for the same thing.

this fosters a society where people are indiscriminately validated. it is an uncomfortable act to admit you are wrong; it's easier to simply find an alternative community that says you're right, and there's room for everyone online. people are not required to face being incorrect. it's literally how trends work, how things go viral, how subcultures form; it's the foundational base behind all fandoms and communities. it's truly the tipping point of human comprehension, and several problems arise from this. news stations know how it works, google knows how it works, reddit knows how it works. anything with an algorithmic feed will be fuelling whatever bias you have and sidestepping your cognitive dissonance, or, perhaps more pressing, they can intentionally apply pressure with an efficiency like never before... consider the implications here.
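the maths on that 15,000 figure, for the sceptical (the 82-year lifespan is my own assumption):

```javascript
// one new person every two days, birth to death
const lifespanYears = 82;                    // assumed lifespan
const daysAlive = lifespanYears * 365.25;    // ~29,950 days
const peopleMet = Math.floor(daysAlive / 2); // one new face per two days
console.log(peopleMet);                      // 14975: still short of 15,000
```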
it's raised the bar on the kind of person required to accept that they're wrong, because not only must they accept they are wrong, they must accept that the 15,000 others they've seen are wrong, and every person they've ever met; they need to accept their ENTIRE reality has been artificially shaped. it goes a long way to explain the mentality we see in people today. this effect is only compounded by the internet holding all information for them: if it's not online, it doesn't exist, and if it's on a website, it MUST be true! i can put anything i want on this website. literally anyone can edit wikipedia. did you know chinese people are actually from the moon? they have lunar gods and everything, i promise, look it up! you can put any crazy shit you want online and nobody questions it, and old wisdom that HASN'T been comprehensively documented online is disregarded. this is terrifying and dangerous.

never in history have people been so consistently and confidently incorrect, with absolutely no means, motivation or incentive to change. never in all history have people had so much information, yet acted so consistently fucking stupid. it will be interesting to see what this ends up being. at the very least, the internet sees all and records all. if only i could live 600 years to see what future humans draw from this era: the first generation immortalised on the internet, the most granular historical archive ever created, and we're acting like idiots.

09/08/2025

Autoimmunity.


For all those with autoimmune disease: Ruled by the incorrect perceptions and assumptions of others, In a world where intolerance, impatience, and misunderstanding prevail, forgotten by most. Cast aside, ridiculed, mocked, doubted. It's more than just the illness. It's a comprehensive perceptual shift, A catastrophic bend in a life's trajectory. Nobody asks for this. When you doubt others as much as you doubt yourself because you've always been doubted, you lose your place to belong. But home still exists: it is in solidarity with those who were also scorched by the same fire. Don't let their projections define who you are. You are diseased, but you are still Human. They won't end this cycle; they haven't had their reality shaken to its core, they can only seek their own comfort. As they cannot understand you, you must accommodate their truncated phaneron. They've never had to gaze upon cosmic horror and question the fundamental. But you? you are built for this. For better or worse, for wish or want. The honor is yours, and for this i thank all who stand up in defiance. I see you. We can make the world a better place. https://www.youtube.com/watch?v=s_nc1IVoMxc

08/08/2025

The_Innernet


The internet isn't yours anymore. it's a tool for surveillance, tracking, analytics, ads, dns spam, scripts, profiling, ai, algorithms, monetisation, "download our app!". it's crazy that we load megabytes and megabytes of script to display 12kb of actual data. it's absolutely wild that we just run code in our browsers indiscriminately. maybe you know what javascript is capable of, maybe you don't, but it should scare the crap outta you. whatever happened to asking before running code on someone else's computer? a simple courtesy, gone? ever looked at just how many requests your browser makes loading a simple page? many pages poll over 200 other domains! what the actual fuck?

we used to type random domains into the browser, not google everything. we'd explore and follow the "links" page on each website; they had a damn landing page that stated how many kilobytes of data were on the other side, and a link, "ENTER", if you wanted to. these days if it's not on google, it doesn't exist, and they've even started removing old indexes and cache, which is catastrophically sad given these are literal kilobytes of information; they'd rather guzzle 2 weeks of footage a second on youtube of ai generated tripe.

you think this is free? it comes at a cost: your privacy. your information is making other people millions of dollars and you see none of it. in fact you're CHARGED, for the privilege of having your information taken and shared, with the one resource you can never get back: YOUR TIME. HOW ARE WE OKAY WITH THIS?

ultimately the internet has lost its soul. what made the internet the internet was the people, the creativity, and sure, there are pockets of that around, i get it... but if you never saw 90s/2000s internet you simply wouldn't understand. it was a whole world. you'd set up a site and it felt like you were talking to the whole world; getting 1000 views on your website was BONKERS. now you get 50,000 views and it's nothing.
there was innovation, passion projects, it was just nerds sharing cool shit they made. it was truly a golden era, and we can take it back. the thing is, the internet is just a means of communication: before internet it was radio, before radio it was letters. we've always had this innate desire to connect and be creative. then the corporations found it and, as usual, fucked it up... i remember pre-corporate internet, when they weren't interested; i distinctly remember saying the internet is the world's last true freedom... now the internet has become entirely corporate. and sure, the internet is for everyone, which means i must accept corporate existences too... however, participation should be optional, and this is not the case currently. in fact it's being ENFORCED. participation in the corporate internet is fast becoming mandatory, and already is in many, many places.

so here's the idea: we make a new one. crazy? maybe, but we literally have the technology already. nothing needs to be made or developed, we just need the people, and i know you're out there somewhere. it's not a new website, nor is it tor, though that could be used... it's a parallel internet: disconnected, private, hand-built, homemade, passionate. no ads. no tracking. no corporations, no money, just nerds doing nerd shit. i call it the innernet. some (mine) invite-only, some open. some self-hosted, some cloud based (not my pick, but whatever). no vips, no patreons, no growth hacks, no algorithm, no corporate bullshit. just people: building things, running services, and keeping the old web spirit alive. you can make it decentralised, you can make it centralised and control it, you can join multiple innernets together; there's no fixed specification, again, i'm just coining the term. you don't "browse" the innernet, you join its ecosystem. in my mind each innernet has one public-facing portal.
that’s the front door find it on pastebin, via a friend, through whispers, who knows, it doesn't matter once onboarded, that same url becomes a town hall of sorts, maybe it's an index page, a google copy, anything, it's the frontpage of that innernet just people showing up and sharing what they love, everything inside is member-hosted for my innernet, no scripting allowed, any javascript you'd be punted from the network, just flat markup the best part is dns, once inside an innernet, the outside internet ceases to exist, so google.com as a domain is available, and handled by one of the users, try to browse to youtube? doesn’t even work, it’s not part of this dimension it makes people matter again, it makes your efforts matter again, and scatters the community across niches where corporate has no control, where your presence isn’t swallowed up by an algorithm or reduced to “engagement,” but noticed, appreciated, shared, felt, and missed when it’s gone. as such the internet, the big one, becomes what it should have been a SERVICE to us, a weird, noisy rss feed you occasionally poke your head into, but not quite each innernet should relay traffic at a gateway or portal to the internet, so we become the scrapers, not them that's the fun thing about information, the same thing they abuse to share our information, we can use to take information from them information is non quantum - this means once it's viewed, it's viewed, if i give you a CD with a text file on it, i can say i gave you a CD with a text file on it, i can't say if you've given that CD to other people, if you've saved that text file to your pc, if you've copied it 1000x i have to assume in all cases at all times even if you just threw it away the moment you got it, that you have the text file in some manner if storing and replaying is stealing, then what is my browser cache... and if i'm allowed a browser cache, why cant i just... keep caching... and instead of browsing myself... 
why can't my computer browse for me... and you get the idea. as such, our bots scrape once, we count as one view, and it's saved, mirrored and replicated internally, free of analytics and ads.

maybe your concern is a dramatic reduction in services and content? the internet is so vast you could spend 80 years online, 24/7, and still see less than 2% of it, and this doesn't count the weeks of youtube watchtime uploaded each hour (yes, really, i am not joking). does it not foster a sense of futility? even 0.0002% of the internet would be enough to fill your entire life with new information. in the end, the internet has lost its soul, because we were its soul. even the smallest slice of the innernet, built with care, curated by weirdos, is enough to be overwhelming, inspiring, and worth preserving, and we're at a point where we have the tools to replace most of the internet entirely. storage is cheap, domestic bandwidth is high, machines are fast, and we don't need permission... yet (as for those ISPs that say you can't host, fuck em, do it anyway, it's all encrypted traffic, good luck). honestly it would only be a matter of time before such things would be made illegal one way or another, but it's community that made the internet, certainly before the hordes of orcs found it, so it's important a community is founded prior to any regulation.

we literally have the technology, right now. LITERALLY right now we can do this. we just need to do it. i think if i feel this way, there must be others out there who feel the same way... the question really is "how many?"...

for me at least, membership to the innernet requires one rule: bring a service, if only out of principle to sustain the concept. host a blog, a gallery, a textboard, a mirror, a bbs, a planetary sim, a file dump, anything. if you want wikipedia? mirror it. if you want youtube?
mirror it. chances are there's someone in the innernet who has already done so. the innernet is for sharing, not for consuming; if you're joining just to consume, go to the internet, that's what the internet is for these days, mass, gross, disgusting levels of consumption. for us the internet becomes a resource for scraping and pulling into our inner ecosystem, not the fundamental infrastructure we depend on.

the details of what goes on within the ecosystem are none of their business, they only see our scrapers. everything is local. dns? ours. auth? ours. cdn? ours. every innernet is a federated island. some are tiny, some are massive. some link up, some stay dark. some die, some thrive. some mutate into entire cultures.

and the timing matters. remember when the uk pitched a "grid", a government-hosted internet with approved websites, subscription packs, and total editorial control? not many do... it was kinda erased from history... and now with the bullcrap surrounding the recent "online safety act", which scares the absolute shit outta me, just what the hell man... sounds dystopian? we're halfway there. app stores tell you what apps are allowed. isps log everything. smart tvs watch you. ai trains on your art without asking. this isn't fearmongering, it's just the trajectory, i've been saying it for years. go ask anyone who owns an "eternal september" tshirt how they feel about it... so let these corporations have their internet, it's full of lifeless bots anyway. we'll make our own.

if you've ever mirrored a site "just in case"? if you hoard vines, pdfs, roms, logs, old art? if you self-host out of spite? if you miss geocities? if you still run irc? if you've ever built something weird, just because it was cool? then you're already one of us. there's no launch, no spec sheet, no roadmap, no crypto, no discord, no signup, just whispers, home servers and a stubborn refusal to let the old web die. calling all data hoarders.
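the "our bots scrape once, and it's saved and mirrored internally" idea from above is genuinely tiny in code. a minimal python sketch of the principle - the local `mirror` directory name is made up for illustration:

```python
import hashlib
import pathlib
import urllib.request

CACHE = pathlib.Path("mirror")  # hypothetical local mirror directory

def fetch_once(url: str) -> bytes:
    """Touch the outside internet at most once per URL; every later read is local."""
    CACHE.mkdir(exist_ok=True)
    key = CACHE / hashlib.sha256(url.encode()).hexdigest()
    if key.exists():                               # already mirrored: one view, cached forever
        return key.read_bytes()
    data = urllib.request.urlopen(url).read()      # the single outside request
    key.write_bytes(data)
    return data
```

point it at the same url twice and the second read never leaves the network; they count one view, we keep the page.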

03/03/2012

ALU!


Hello, i have been looking through some old logs and images on my server backup and found a ton of images of the development of my ALU in minecraft :) although the images may be small or slightly distorted, they do the job, they aren't the originals. either way - here goes! i started development making the smallest fully functional full adder (AND SUBTRACTOR) in the world - basically i copied the world's smallest full adder and smacked a not? (i think) gate on the top - it was a long time ago. after this i simply copy/paste them in a line with the subtract "rail" running above the inputs. the Ci Co are right next to each other between the adders - the output is at the back :) [IMG] [IMG] [IMG] i even put a little name tag on because at this point i released the map/adder online on the minecraft forums. [IMG] So then it began - filling in all the cables - in a huuuge structure - putting repeaters every 15 redstone blocks (15 meters) [IMG] [IMG] [IMG] [IMG] [IMG] filling in ALL of the rails would be exhausting, but then i came across a problem - i work in binary backwards - so 101 at the furthest RIGHT is 3 to me - meaning i had to flip the entire i/o over - not tricky when each "rail" actually takes up 2 meters of space vertically - nevertheless i made it. [IMG] DAMN RAIN! D: [IMG] just starting to make it - the start of it (under the top rails) is far overcomplicated [IMG] after simplifying the start - i finished (32bit) [IMG] another view of both 32bit flipped [IMG] here is a side view of it [IMG] The ALU is already far too big to be seen within the view distance of 15 chunks? (i think) - standing at the center of the "flip" circuit - i can barely see past the part where it corners.
so i had to run Cartographs to show the full circuit. at this point i start creating the full 64bit [IMG] above is 32bit - below is 64bit, after i moved the "flip circuit" to the new center and started making new "flippy rails" to bring in the new inputs [IMG] [IMG] here is the 64bit "flippycircuit" [IMG] [IMG] [IMG] they are offset right? yup - this is because the inputs of the ALU have two blocks empty between them (they are offset by two blocks) :) [IMG] see what i mean about barely seeing past the corner? [IMG] SO! inputs complete - i start putting power to some inputs - suddenly - (Dramatic music) a problem occurs that kills the WHOLE project :( the ALU is simply TOO BIG! the redstone just STOPS after a few chunks - considering this was hosted on an 800mhz Pentium 3 with 512mb system ram - no wonder! nevertheless - i got my thinking cap on :D [IMG] Basically what happened here - i cut a huge hole in the floor right down to layer 1 because i was about to put the whole ALU on the Vertical axis - i copied 16 of the adders - pasted them in a line - then pasted another line on top, and so on - making 4 lines of 16 - 64 bit! :D i then copied the "flippy circuit" over (note also - each time i copy/paste with the low amount of ram - it always corrupts, leaving huuuge "gash" like structures of "air" in whatever i am pasting - quite annoying.) [IMG] after joining them up like above - i hit a problem where the 2nd set of inputs could not reach the ALUs! :( so i had to rethink and make below! [IMG] in the end i settled for this - huge rails curving around the left edge and back (for organisation) [IMG] i tried to do it quickly but it resulted in this - redstone EVERYWHERE lagging my client - threatening to corrupt the map!
:( [IMG] [IMG] [IMG] Regardless - i connected the rails up to the ALU - but i hit a problem with the inputs - so i had to break those and replace them [IMG] So here's a few pictures of the ALU in its current state (Generally) [IMG] [IMG] [IMG] Now - another problem - i forget what the hell it was LOL! but either way - it involved me having to flip both sets of inputs - then flip the output over O_O it could probably be done without flipping anything at all - but it's done now - i'm not going to rip it down after the hours i put in XD here's the pictures. [IMG] i got rid of the "flippy circuit" and pasted it twice above - dropping the "outputs" of that circuit down below it - i then ran rails straight across back to the "start" - dropped them down back into the big circuit that curves across to the left *breathes* O_O [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] INPUTS DONE! :D (NOT WIRED UP YET!) now for the output - i started with moving the outputs across 4 blocks to line up with the rest of the ALU (no pictures sorry :( ) i then raised the output to RIGHT above the entire ALU - then moved them all across to their respective group of 16 (i'm moving 16 x 4 to 1x64 - so it goes layer 1 = first 16, layer 2 = 2nd 16 - i'm sure you get it) [IMG] [IMG] [IMG] [IMG] looking pretty big huh? still lots to do! i moved the outputs 16+16+16+16 to the end of the "flippy circuit" protruding from the top (pushed out to make room) - i dropped these outputs down and back in - then down again to hit the Output that you read! :D [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] [IMG] amazed? i am - 90% of this was built BY HAND! - not using worldedit or other macros :( although - STILL NOT WIRED UP! this is just the FRAMEWORK! either way! the Adders needed the Ci Co and SUB connected between the layers - the best way i could do this was simply a spiral design!
[IMG] [IMG] As for the adders corrupting during the pasting - i pasted a single one right out of the adder and fixed it - using this one as a model for all the others - i manually fixed all 64 of them [IMG] here's two cool pics [IMG] [IMG] there's that many rails that it blocks out ALL light below in some places :D and for now - the end! - i am working on wiring it all up - with the help of a friend, nameless-exe - we have spent around 4 hours wiring it up and are around 50% complete - i HAVE actually got TWO i/o fully wired up for testing and as a model - those work and take around 15 seconds to process! that is simply because of the extortionate I/O. i'll keep you up to date :)
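as a footnote for anyone wondering what those pasted redstone blocks actually compute: a full adder takes two data bits plus a carry-in and gives a sum bit and a carry-out, and the "not gate on top" subtract trick works by inverting the B bits and feeding the subtract rail in as the first carry-in, which is two's complement subtraction. here's a python sketch of that logic - the names and structure are mine, not a map of the actual redstone layout:

```python
def full_adder(a: int, b: int, cin: int):
    """One redstone full adder: sum bit and carry-out from three input bits."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def alu_add_sub(a: int, b: int, sub: int, bits: int = 64):
    """Ripple-carry chain, like the 64 adders pasted in a line.
    The subtract 'rail' inverts every B bit and doubles as the first carry-in."""
    carry = sub                       # the not-gate-on-top trick: sub acts as Cin
    out = 0
    for i in range(bits):
        bi = ((b >> i) & 1) ^ sub     # invert B bits when subtracting
        s, carry = full_adder((a >> i) & 1, bi, carry)
        out |= s << i
    return out
```

flipping the subtract line from 0 to 1 turns the same hardware from A+B into A-B, which is why one extra gate per adder was all it took.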

01/20/2011

Cluster Computer


Hello! The first post on my blog! Well, I'm not usually one for Blogging, but this is a special occasion as i have finally launched a project to build a home cluster computer. (note i have decided to keep updating this one post as to keep it in order, makes it easier to read ;D ) Now you may ask what a cluster computer is, well, i'll briefly explain it: it's where you get many smaller computers, wire them up through a network and then use special software to make them talk to each other, which creates one giant computer! you can call it a Super Computer though. The launch of the project involved me contacting the college, as i need a fair amount of computers to make a cluster worth using, and after various emails and meetings, it was arranged that they would help fund this project for me, which includes two network switches and 10 nodes. Today was an exciting day because i was able to pick up the two switches, but not the nodes, as the college are still "scrubbing" the hard drives, but i have to say, these network switches are quite mean machines compared to my previous 5 port switch i had off ebay for £7 (even the sticker said "5 prot switch", yes, the sticker was typo'd). These new switches recently surprised me as i discovered that they have NAT loop-back, or something similar, because i would usually have to connect to my existing server through Hamachi as a direct LAN ip wouldn't connect (the server was only designed to accept WAN or "outside" connections), but when the Hamachi servers went down, i completely forgot about the need for a virtual LAN interface, and just connected via a straight LAN IP thinking that i was dumb for using hamachi, then it dawned on me that i had used hamachi because i couldn't do exactly what i was doing!
[IMG][IMG][IMG] i currently have one of the new switches in place of that, ready for the cluster to be connected to the other switch, and then i will pick up a really high quality cable to run between the two switches as my main PC will be the master node of the cluster. [IMG] i have been reading up on cluster computers, and it turns out you can pick up a stack of original Xboxes (730mhz) and load linux on them to cluster them together, which gives me a crazy idea, as the Xbox cluster (seen here http://www.bgfax.com/xbox/home.html ) seems to have sparked a bit of excitement in the cluster computing community. i'm yet to see a cluster made of the PS1, and i know i have around 3 of those gathering dust in the attic, alongside a Dreamcast and a Sega Saturn, so it is tempting to try and load any distro of linux on them, because if i can get ANY linux running on one, i can then run a virtual machine on it which can run the cluster distro, which i can then attempt to develop into a native "dreamcast cluster distro", but either way, getting consoles into a cluster will be quite cool and may look bizarre with a stack of PCs then an Xbox on top all running together as one PC. i have an old 800mhz laptop here and a 2.8ghz pentium 4 PC with fried RAM which i can also integrate into the cluster, although i do want the cluster on its own electrical circuit and network switch to help keep any problems with power/networks isolated.
at the moment, my PC (2 PSUs under heavy load), server and switch are running on one circuit while the fan and light are on another, but i guess adding another 10 nodes onto this one circuit may blow a few fuses so best be safe :) as for cluster rendering with Sony Vegas, the version i updated to doesn't support cluster rendering, which has irritated me a bit as i make many videos (http://www.youtube.com/user/bugz000?feature=mhum) and the rendering of these can range between 20 mins to over 6 hours depending on the complexity and the quality, such as some of my CoD4 videos where i render in full 1080p HD (using 1800 1600p HD source files) with sharpening/color tweaks/transitions on all or most frames, so i will probably rollback to Vegas 9 again to render across the network. rendering a video with vegas isn't like parallel CPU processing, it works by, say, 10 nodes: it will split the timeline into 10, and send each part of the timeline along with the source files it needs, and that computer will add any requested effects/transitions then send it to the PC that you allocate to do the final encoding, which could be your main one, and then once all the files have been sent back, the allocated PC will "stitch" all the files together and there you have it! a render cluster at work. although this would be bottlenecked by a 100mbit connection, with these new network switches and their gigabit connections, sending a 30mb file should be rapid (final size being 300mb). -8th feb 2011- after much lifting and hauling of computers, alongside photo-shoots with a gorgeous model -flicks hair- i have finally managed to get the computers into place! (photo of a news article below, 11th feb 2011) [IMG] The college had also provided me with a mouse and keyboard for each, also a set of power cables which i have plugged into each node. the only thing which i need now is a few "extension leads" or "multiplug sockets", i call them "gang sockets", and of course, Cat5 RJ45 cables!
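the split-render-stitch scheme vegas uses (as described above) is easy to sketch; here's a python toy of the idea under made-up names, not how Vegas actually implements it internally:

```python
def split_timeline(frames: int, nodes: int):
    """Cut a `frames`-long timeline into one contiguous span per node,
    e.g. 10 nodes each take a tenth of the video."""
    base, extra = divmod(frames, nodes)
    spans, start = [], 0
    for n in range(nodes):
        end = start + base + (1 if n < extra else 0)
        spans.append((start, end))
        start = end
    return spans

def render_span(span):
    """Stand-in for a node applying effects/transitions to its span of frames."""
    return [f"frame{i}" for i in range(*span)]

def stitch(parts):
    """The allocated PC joins the rendered parts back in timeline order."""
    return [frame for part in parts for frame in part]
```

each span would go out to a different node over the network; the allocated PC just concatenates the returned parts in order.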
[IMG] -10th feb 2011- after posting that i need Cat5 cables on this blog, i managed to get a set of short patch cables (the kind that would usually run between a patch panel and a switch) for free from where my mum works, but i needed at least 1 more cable to run between the two switches, which thankfully, the college were able to provide for me, so now i have the cluster all wired up, except for one vital element... POWER! spaghetti anyone? [IMG] i put the switch down behind the cluster so the small patch cables can reach [IMG] you can see it's quite a mess behind there! notice the grey cable leading off to the left, that's the cable running to the other switch, i had to run it under the carpet and tug it a bit but it finally reached the other switch with slack to spare, as you can see below [IMG] the dark grey one goes towards my cluster, the blue one goes to my pc, the yellow to my server, and the white one goes under the floorboards to the other side of the house into the router (i call that one the lifeline cable) [IMG] here you can see my whole PC before any cluster computer development took place, the grey laptop in front of my PC has been shut down as it is 800mhz, but i may be putting this into the cluster at a later date. notice also instead of the old switch, or the new one, i have a wireless router acting as a switch on split horizon DNS, which i gave up maintaining as it kept breaking down and having random problems, maybe i didn't do it right, either way i have a badass switch in its place. (sidenote - it's been a long time since my desk was that tidy! i think i need to clear it again xD) -12th feb 2011- LET THERE BE LIGHT! thanks to some help from dad, i have got the cluster powered up with spacers between them to keep an air flow! now to get myself a stack of CDs and start burning!
[IMG] -21st feb 2011- i have been thinking on how best to implement this, as i may need to have a keyboard attached to any computer at any given time, and i do have a mouse/keyboard per PC, but no screens, and a mouse/keyboard per pc in this small room would get incredibly messy. i have been thinking of running a KVM system as my main screen has a built in dual input KVM, but as i was researching, i remembered i hadn't even got the software downloaded! i was planning to use OpenMosix as a process migration interface (not thread sharing - processes are migrated across the whole network with no master node, meaning you can have unlimited users on unlimited computers, and it will all work as if there was one person using one PC, sharing the load of all the users across the entire thing, genius!) downloading the source code, i was about to burn it to CD when i thought, is this how it is supposed to be done? burn to CD, or is it a kernel tweak? i browsed to http://openmosix.sourceforge.net/ to find "The openMosix Project has officially closed as of March 1, 2008." ... DAMN. although reading further i found a project called LinuxPPI which claims to have continued Openmosix development, and they also claim that Openmosix was a failed project, meaning it didn't work? either way, talking to my tutors, they were fairly adamant that it is possible to parallel cluster on a Windows OS, so reading further into this (the internet knows EVERYTHING! it's crazy!) i discovered MPI! Message Passing Interface: coded in C, it is a base for programmers to create applications for a cluster computer! although most of these applications are written for huge companies who pay for the development, so they are never released publicly. there is one application that i have found that apparently generates prime numbers (using a fairly clunky method, but it works) and it's on MPI!
https://computing.llnl.gov/tutorials/mpi/exercise.html Reading into this further, and reading this code https://computing.llnl.gov/tutorials/mpi/samples/C/mpi_prime.c i notice this:
#define LIMIT     2500000     /* Increase this to find more primes */
#define FIRST     0           /* Rank of first task */
meaning i can essentially have each computer generating a separate chunk, 0-2500000, 2500001-5000000 and so on, with no communication between them, cutting out the problem of the bandwidth in the network - although we are working with concepts here, that would defeat the point of a cluster! so the next step is to get a copy of Windows on each PC, and get installing MPI.

-23 July 2011- Wow it's been a long time since i've done any progression with this, coursework was a priority but now i have some more spare time. i have been programming a lot in AHK (AutoHotkey) and i had an idea, as i use this language to interface with my main website www.bugz000.co.uk - the genius idea i had was instead of having to learn an entire new language, JUST USE THE ONE I HAVE! although i'm not one for math, i asked on the IRC for assistance and Rseding91 went out on a limb and just created me a function to generate prime numbers! :D!

    msgbox % Find_Primes(1,100)

    Find_Primes(_Min,_Max) {
        If (_Min > _Max)
            Switch := _Min, _Min := _Max, _Max := Switch
        Current_Num := _Min
        While (Current_Num <= _Max) {
            IsPrime := Current_Num "|"
            Loop % Floor(Sqrt(Current_Num))-1 {
                If !Mod(Current_Num,A_Index+1) {
                    IsPrime := ""
                    Break
                }
            }
            if (IsPrime) {
                traytip, Primes, Found Prime! %Current_Num%, 10
            }
            Primes .= IsPrime, Current_Num++
        }
        Return SubStr(Primes,1,StrLen(Primes) - 1)
    }

it's working brilliantly, generating right now as i'm typing into this textbox, i'm on 1632*** (numbers scrolling past too fast), either way 7 digit numbers, and it's been running a mere 5 mins or so! the next thing is to get this loaded up to work across the network, having my main computer as a master node to sync them all and gather results, shouldn't be too hard (i hope). i still need to get round to installing 10 copies of XP :o Jake` on the IRC just found the current largest prime number in the world, generated in 2008: http://prime.isthe.com/no.index/chongo/merdigit/long-m43112609/prime-c.html - 13 million digits long.. HMMM....
these guys got $100000 for generating that (they use a distributed computing method). quick update, this is the most recent pic of my computer so far :) [IMG] i have two servers now, one hosting Minecraft, where i'm building a huge (60 meter tall) 64bit ALU with full I/O (soon to have seven segment output and "numpad" style input, should be fun). -29 January 2012- Well it's been a long time since i updated this yet again - but this time i have been busy making the actual code! WOO! i got the winsock working - and i was generating primes over the network - the only problem is that when a connection is broken - ANYWHERE in the cluster - the ENTIRE cluster program (on every node) will close. i've tried my best to stop this happening but to no avail, so i will have to look more into the winsock program - the problem is that i am looking to make this base modular so i can write new scripts to run on the cluster and not have it JUST generating prime numbers. so i thought IRC! - i loaded up an IRC server and pulled the IRC bot "proof of concept" script from the forums - modified it, and i am currently trying to get the main server to log which nodes are responding, by outputting a command which triggers a command on each node - the node then outputs another command which triggers the server to update an array containing all the names of the bots responding - GENIUS!
although one catch - even though this is on LAN - they all respond at the same time, and the "server" bot only receives two, so i am looking to stagger them somehow. currently i am doing a sleep of a random var between 1ms and 10000ms (10 seconds), although this still has a chance of spamming out more than two at a time, so i am starting to lose hope with the IRC idea and will simply go back to the Winsock, and have another program check whether the main program is open. although i do remember seeing something on the forums about an "ad-hoc" style winsock client/server script - which basically only opens the connection to send the data - then closes it. i should try and find this script and put it to use :)

-30th January 2013- Wow! it's been a YEAR!!? i have been busy! well, quick update - i have made a few programs in AHK now - including my B_PAGAN art generator that uses WINSOCK! (AHKSock to be precise) :D i finally got to grips with it!!! AHKSock is a wrapper for Winsock (as raw winsock is terrible) - and i've written myself a server/client program that is a wrapper for AHKSock and it works flawlessly, so happy! so with my new technology i've spent the last day or so programming the cluster program! Here you can see it working :) as it is right now the program is fully working but has a few bugs, these include:

- if the data is too long - it sends in chunks in "rapid fire" which overloads the server
- if more than one client returns at the same time, it can't handle it

both easy fixes :) i'll just implement a 500ms delay in chunk sending, and as for client "synchronisation" - i will have the clients simply wait to be "called" for their data - simple as that! :) so here's what happens:

    Server: Hey im ready! :D
    Client1: Connecting :)
    Server: New connection! Client1 (update active socket list)
    Client2: Connecting! :D
    Client3: Connecting!
    Server: New Connection! Client2 (Update Active Socket List)
    Server: New Connection!
    Client3 (Update Active Socket List)
    -Start is pressed-
    Server: Okay who is still here!
    Client1: Here!
    Client2: Here!
    Client3: Here!
    Server: Okay i have active client list
    Server: Doing math
    Server: Math complete!
    Server: Client1 Do this chunk x-y
    Server: Client2 Do this chunk x-y
    Server: Client3 Do this chunk x-y
    Client1: Completed!
    Client3: Completed!
    Client2: Completed!
    Server: Active Client List is Complete!
    Server: Client1 please send me your data
    Client1: here's the data xxxxxxxxxxxxxxxxxxxxx
    Client1: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client1: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client1: Data Transfer Complete
    Server: thankyou client1, you may close now
    Client1: Closing!
    Server: (update active socket list)
    Server: Client2 please send me your data
    Client2: here's the data xxxxxxxxxxxxxxxxxxxxx
    Client2: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client2: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client2: Data Transfer Complete
    Server: thankyou client2, you may close now
    Client2: Closing!
    Server: (update active socket list)
    Server: Client3 please send me your data
    Client3: here's the data xxxxxxxxxxxxxxxxxxxxx
    Client3: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client3: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    Client3: Data Transfer Complete
    Server: thankyou client3, you may close now
    Client3: Closing!
    Server: (update active socket list)
    Server: Operation Complete
    Server: Listening...

So hopefully that clears up what's going to happen with the program :) should have this completed by march at the latest!

-1 February 2013- oh? what's this!?
:D fully working prototype :) here's a cool/annoying/but still cool mechanic xD because of the way it uses winsock, it's actually multithreaded - which causes issues (hence why you see strange commands passing back and forth) - the multi-threaded "network extension" trying to pass into a single-threaded application - the NETWORK portion can handle it, but not all lines get output into the log. that's only graphical - i fixed most issues, but in doing so i made the code horrific xD another cool thing is i can open more than one on each computer - i'll have it fill all cores except one (leaving the last core for winsock), meaning if those are celeron duos in the cluster (as i think they are) - that's 20 nodes (plus ~3 on my main pc)! AND the server can handle up to 1024 nodes! :D awesome. -6 february 2013- so i'm writing the program which will keep the clients "alive" as such (by simply rebooting the process each time it "dies", based on a command i send via IRC most probably) and i had a little hiccup! xD [IMG][IMG][IMG][IMG] That was fun to clear up! xD regardless, here is a new UI and a better console log (less spammy) :) [IMG] -- MISSING LINK-- -06 February 2013- Same date but i've slept since the last post - it's like a new day to me :D so i've had another breakthrough. my plan so far was to have a script hook into IRC where i can see which bots are active and whatnot, but it seems winsock has completely given up and every IRC bot in existence (written in AHK) has completely stopped working... bugger! so i looked for an alternative, i thought "maybe a 2nd entire server? wait...
wtf am i thinking" so i set to and simply made both programs clear all variables instead of rebooting each time - they work seamlessly time after time :D i have a program log here [IMG] the significance of this is the IP Address: it's the first ever non-localhost test and it worked fine :D - not to mention i told the script to disconnect - and it did - and the "keepalive" program booted it back up again :) (easy fix - on disconnect, have it kill the keepalive program). there is a drawback to this though... my keepalive program is a bit... primitive, as i cannot work out how to fetch the PID of the script as it's running through Autohotkey.exe (easy solution, just can't work it out at the moment). i got around this by having the client open up a ridiculously tiny UI [IMG] and have the keepalive program minimise it about 1 second after booting (no idea why it can't do this faster, but it's only temporary). the keepalive now has a direct reference to the process and whatnot via the GUI's name - _!Client (believe me, somewhere crushed into that GUI, there is a window title XD) [IMG] see! :p now this is very nasty, but i'm not that worried about it as this is going on a computer which doesn't have a screen - i won't see this lol, but it could be done better. i'm more worried about this issue: the script works fine with ~7 clients on it, but watch when i only use 1 or 2 (1 in this example) -- MISSING LINK-- it seems to go too fast or something and it overtakes the client. i'll have to look into this some more, but for now - so long as i use enough clients - it works :) oh! i almost forgot - a quick update on my main rig :) -- MISSING LINK-- -- MISSING LINK-- -- MISSING LINK-- -24 May 2013- WOOOOOOOOOOOOO!!!!! massive breakthrough!!
you probably read earlier that i was limited by the 64bit integer value - well, i found a lib which can handle infinitely large numbers - just slower - so i've modified the algorithm to support it (with help from rseding91):

    Find_Primes(_Min,_Max) ;made by Rseding91
    {
        if Greater(_Min, _Max)
            Switch := _Min, _Min := _Max, _Max := Switch
        Current_Num := _Min
        loop {
            if Greater(Current_Num, _Max)
                break
            IsPrime := Current_Num "|"
            Loop % Evaluate(nthRoot(Current_Num,2),"-1") {
                If !Remainder(Current_Num, Evaluate(A_Index, "1")) {
                    IsPrime := ""
                    Break
                }
            }
            Current_Num := Evaluate(Current_Num, "1")
            Primes .= IsPrime
        }
        Return SubStr(Primes,1,StrLen(Primes) - 1)
    }

so this can now process infinitely large prime numbers! (albeit slowly :p) to combat the speed issue - for speed runs i'm keeping the previous faster algorithm onboard, so if i'm checking for primes UNDER the 64bit cap - it'll use the fast one, otherwise it'll use the slow one :) Can't wait to get this thing working properly :D i'm yet to have got this working over the internet - not entirely sure why... will have to do some investigation. however i do plan on contacting a local school or college and turning their entire complex into a giant cluster computer for this to run, maybe 1000+ clients :D i can't imagine the speeds xD I'll keep you updated!
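a side note on the big-number approach: some languages get this for free - python's integers are arbitrary precision out of the box, so the same slow-but-unlimited trial division looks like this (a sketch for comparison, not a port of the AHK lib):

```python
from math import isqrt

def is_prime(n: int) -> bool:
    """Trial division up to sqrt(n); works on integers of any size,
    just increasingly slowly - the same trade-off as the big-number lib."""
    if n < 2:
        return False
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

def find_primes(lo: int, hi: int):
    """Inclusive range scan, like Find_Primes(_Min, _Max), swap included."""
    if lo > hi:
        lo, hi = hi, lo
    return [n for n in range(lo, hi + 1) if is_prime(n)]
```

`isqrt` plays the role of the `nthRoot` call, and there's no 64bit cap to route around, so no need for fast/slow dual algorithms.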
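and looking back at the server/client transcript from the january update - dispatch a chunk to every client, then ask each one for its data in turn (reading sequentially is exactly what sidesteps the "two replies at once" problem) - the same flow can be sketched with plain python sockets. the function names and newline-framed messages here are my own invention, not the AHKSock code:

```python
import socket
import threading

def serve(srv: socket.socket, chunks, results: list):
    """Toy master node; `srv` is already bound and listening.
    Hand one chunk to each client, then collect replies one client at a time."""
    conns = [srv.accept()[0] for _ in chunks]          # wait for all clients to join
    for c, (lo, hi) in zip(conns, chunks):             # "Client1 Do this chunk x-y"
        c.sendall(f"{lo}-{hi}\n".encode())
    for c in conns:                                    # "please send me your data"
        results.append(c.makefile().readline().strip())
        c.close()                                      # "you may close now"
    srv.close()

def client(host: str, port: int):
    """Connect, take a chunk, do the math, report back."""
    s = socket.socket()
    s.connect((host, port))
    lo, hi = map(int, s.makefile().readline().split("-"))
    primes = [n for n in range(max(lo, 2), hi)
              if all(n % d for d in range(2, int(n ** 0.5) + 1))]
    s.sendall((",".join(map(str, primes)) + "\n").encode())
    s.close()
```

run the server in one thread and a client per node; because the master polls each connection in turn, two clients finishing simultaneously can never talk over each other.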