00:00:07 mediawiki seems like a heavyweight wiki 00:00:38 well, yes 00:00:47 (again) 00:01:22 heavyweight in what way? system requirements, features, lines of code... ? 00:01:40 system requirements (ram), and database 00:02:38 graue: what hardware are you running on? 00:03:30 so wikicities is out? 00:03:31 it's not going to have lots of traffic, so I don't think it will be a problem... 00:03:48 kipple_: it might 00:04:18 lament: I think it's scary, because we ultimately lose control and if wikicities goes down, all is lost 00:04:33 if graue is using BT to download and upload lots of porn, the site will be slow no matter how little traffic it has 00:04:40 mirrors are a way of dealing with network traffic 00:05:06 you can get (daily I think) db-dumps from wikicities too 00:05:06 calamari: you can get the DB dump on wikicities, no? 00:05:16 i don't own the server, it's shared, specs are here: http://textdrive.com/specs/ 00:05:33 calamari: and i trust wikimedia more than any individual with a server 00:05:38 to keep the thing up 00:05:49 me too. that's why we mirror it 00:06:25 lament: who is going to store them? Unless people are actively involved, I foresee the db dumps eventually going to a dead server, then wikicities goes offline, the dumps are attempted to be retrieved, and only then is it discovered that the person keeping dumps was gone long ago 00:06:38 I trust 3 mirrors more than wikicities (which is NOT part of wikimedia) 00:07:15 calamari: anybody involved in anything esoteric is likely to very suddenly stop being involved 00:07:43 lament: btw, wikicities is not a wikimedia project 00:07:54 oops 00:08:25 With full-on mirrors rather than just backups, if the main goes down, you don't have to go begging after somebody for content, it's still there. 00:08:28 I fear what will happen if the ads on wikicities aren't enough to pay the hosting expenses 00:09:58 anybody know if there is a read-only setting in MediaWiki?
00:11:29 * calamari checks #mediawiki 00:12:12 well, i have this usermode linux box running and i'm willing to devote it to the site 00:12:23 just might need help administering it from time to time as i'm about to travel 00:15:10 kipple_: yes there is a read only setting 00:15:17 great 00:15:29 then read-only mirrors should be easy 00:15:41 kipple: has edut redirecting been abandoned? 00:15:46 err edit 00:16:06 is it possible to do it? 00:16:35 kipple: not without modifying the source code 00:16:47 we should try to avoid that IMHO 00:16:49 I think it's not an option 00:17:05 there's a lot of problems involved 00:17:53 there should at least be an edit way for people on the mirrors to edit, or they will never be used for anything 00:18:00 err easy 00:18:25 then nobody will know when the mirrors go down, because they are unused :) 00:18:26 they don't have to be used. only be there in case the main goes down 00:18:34 ah 00:19:56 I agree with calamari - there should at least be a simple header("Location: /edit.php"); 00:20:43 guest users can't edit anyway 00:21:05 or that's my opinio 00:21:07 +n 00:21:07 * calamari has an idea 00:22:09 each mirror can redirect to the main site, unless the main site is down (pretty sure there's a way to test for that), in which case the read only local copy is shown. The main site would only allow connections that were from the mirrors 00:22:46 what is it called, the referrer? 00:23:39 that way people are forced to use the mirrors, rather than accessing the main site directly 00:23:52 I don't know. Would make it more complicated to set up. KISS? 00:24:06 is it really that complicated? 00:24:17 don't know 00:24:26 anything involving touching that code is risky 00:24:33 we wouldn't be 00:24:57 how would you prevent connections from non-mirrors? 
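calamari's idea above (00:22:09) — redirect to the main site unless it is down, in which case show the read-only local copy — can be sketched in a few lines. The real site was PHP/MediaWiki, so this Python sketch only illustrates the logic; `MAIN_URL` and the probe function are invented names, not the wiki's actual configuration:

```python
# Sketch of a mirror's front-end decision: send visitors to the main
# site while it answers, and serve the local read-only copy when it
# is down. MAIN_URL and main_site_is_up are illustrative placeholders.
from urllib.error import URLError
from urllib.request import urlopen

MAIN_URL = "http://example.org/esowiki"  # hypothetical main-site URL


def main_site_is_up(url=MAIN_URL, timeout=5):
    """Probe the main site; any HTTP response at all counts as 'up'."""
    try:
        urlopen(url, timeout=timeout)
        return True
    except (URLError, OSError):
        return False


def route_request(path, is_up=main_site_is_up):
    """Decide what a mirror should do with an incoming request:
    ('redirect', url) while the main site is reachable, otherwise
    ('local', path), meaning serve the read-only mirror copy."""
    if is_up():
        return ("redirect", MAIN_URL + path)
    return ("local", path)
```

With a fake probe, `route_request("/edit.php", is_up=lambda: False)` falls back to the local copy; MediaWiki's own read-only setting (confirmed at 00:15:10) would keep that copy uneditable.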
00:25:06 then the server needs a list of mirrors 00:25:09 kipple: you'd only prevent them from the main page 00:25:18 i have an idea 00:25:24 make an Esolang:Mirrors page on the wiki 00:25:29 kipple: yeah it would.. but it'll need that anyways to send out dumps 00:25:29 listing the URL of the original, and the URLs of the mirrors 00:25:48 since that page will be mirrored along with everything else, any copy can be found from anywhere 00:26:09 um, the main shouldn't send out dumps. Only provide them for download 00:26:17 kipple: oic 00:26:27 oic?? what's that? 00:26:39 "oh, I see" 00:26:52 aha. anyway, that's at least my opinion 00:27:28 makes it easier to set up a mirror (i.e. no configuration is needed on the main site) 00:28:30 I see little reason to set up a mirror if nobody is going to use it, though :) 00:28:42 it's just for backup. 00:28:54 that's not what I meant.. but ok 00:29:13 I more or less agree with calamari, but I don't see why people are not going to use mirrors 00:29:27 I don't see a reason to use a mirror? 00:29:34 pgimeno: because you can't edit 00:29:50 why use a mirror if you can use the main? 00:29:53 if only editors can edit, that's not a problem 00:30:08 kipple_: "Please use a server near to you" 00:31:05 similar to what one gets when downloading a file from sourceforge 00:31:31 I don't think traffic will reach the point where that is necessary 00:32:06 I don't either.. it's not really a traffic problem.. I'm just concerned that the mirrors will evaporate without anyone knowing it 00:32:34 a rotating DNS could perhaps also help 00:32:40 but that's harder to set up 00:32:51 (similar to irc.freenode.net) 00:33:08 yeah that's too much trouble I think 00:33:29 here's how I see it: if you are afraid of something disappearing, take a backup yourself.
All we should do is make sure the main site facilitates that 00:33:36 people aren't going to want to change their config, well unless it can be automated 00:33:52 kipple: that' 00:34:07 s the whole point tho.. otherwise, what's wrong with graue's site? 00:34:50 I don't follow you. Have I said there's something wrong with graue's site? 00:35:08 As long as there are enough alternative modes of communication (DirectNet and IRC and DirectNet and email and ... DirectNet :P )and enough auto-downloaders, I think just having one active main isn't a problem. 00:35:45 one thing they taught us in first aid training is never to say "someone call for help", because then no one does (someone else must already have done it) the better alternative is to pick someone (or in our case multiple people) to do it 00:36:27 Hence auto-downloaders rather than humans. 00:36:30 (Damn humans) 00:36:37 kipple: ?? not trying to imply anything wrong with his site 00:36:47 I'm not saying "someone call for help". I'm saying "YOU call for help" ;) 00:37:17 but, of course, you have a point.... 00:37:32 how can we be 100% sure a backup is taken..... 00:37:33 kipple: I didn't realize you were speaking directly to me individually 00:37:58 I wasn't really 00:38:11 then my point is valid 00:38:22 dang, it's hard sometimes to communicate by IRC :) 00:38:26 hehe 00:38:56 I'm glad graue has his site up.. it looks really nice 00:39:07 kipple_: by checking the mirror site ;P 00:39:12 The best way to guarantee it would be to upload from the main rather than trusting the mirrors to download. 00:39:38 why? 00:39:52 if the mirrors go down, upload fails as much as download 00:40:11 But the main would know it, and could put a big red banner on the page saying "THIS MIRROR IS DOWN!!!!!!!!" 00:40:32 you could still do that, without uploading 00:40:48 Hmm, I suppose you could check whether a mirror has downloaded... 
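The upload-model variant GregorR describes — the main site knows when a mirror stops responding and can "put a big red banner on the page" — amounts to a loop over the Esolang:Mirrors list. A sketch with an injectable probe so it runs without a network; the mirror URLs are placeholders:

```python
# Sketch: the main site walks its mirror list and flags dead mirrors
# itself, instead of trusting each mirror admin to notice on their own.
# The URLs here are invented for illustration.
from urllib.error import URLError
from urllib.request import urlopen

MIRRORS = ["http://mirror-a.example.net", "http://mirror-b.example.net"]


def is_reachable(url, timeout=5):
    """Default probe: any HTTP response counts as alive."""
    try:
        urlopen(url, timeout=timeout)
        return True
    except (URLError, OSError):
        return False


def mirror_report(mirrors=MIRRORS, probe=is_reachable):
    """Return {url: status} suitable for a mirror-status page."""
    return {url: ("up" if probe(url) else "THIS MIRROR IS DOWN!")
            for url in mirrors}
```

Either model needs only this one page to be checked, rather than every mirror individually.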
00:41:06 I would like to be able to take a backup of the site WITHOUT being dependent on the current admin to give it to you 00:41:46 The problem with the download model is that you can't trust people to download - if the main goes down, it's possible that nobody would have backed it up. (cont. next line) 00:42:12 The problem with the upload model is that the (possibly non-existant) administrator of the main site needs to make changes for a new mirror to spring up. 00:42:31 unless it can be automated 00:42:36 So make the process of adding oneself to the upload list automated. 00:42:38 Damn. 00:42:41 You win the typing contest. 00:42:41 downloads can be motnitored 00:42:54 monitored 00:42:56 pgimeno: That would be significantly more difficult I think ... 00:43:13 Especially if it's via HTTP or whatnot... 00:43:56 really? "Last downloads: 80.35.19.122 2005-05-25 17:20" 00:44:09 php can monitor that 00:44:59 Oh, so the download would be through a PHP script? 00:45:09 could be 00:45:11 I thought there would just be a file floating on a server somewhere that got updated now and then X-D 00:45:46 even so, that file could be gotten through a PHP script 00:45:52 mmm php in a nutshell, to be published July 2005.. I'll have to ask for that one for Christmas :) 00:46:22 "gotten"? is that correct? 00:46:22 However, then there's another problem. 1) You'd need a daemon to actually do anything with that info, 2) there would still need to be a main-site mirror list for that to be useful. 00:46:45 Hmm, "gotten" ... I think so? Me not talk English. 00:46:59 retrieved? 00:47:10 AHH! LONG WORDS HURT GREGOR! 00:47:16 hehe 00:47:30 d/led? ;) 00:47:43 grabbed 00:47:50 leeched 00:47:55 purloined 00:48:07 downloaderized 00:48:12 anyway, just a special page with the last downloads seems sensible 00:48:59 Then it falls back to trusting humans - who's going to check that page to make sure everything is in order? 
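The "Last downloads: 80.35.19.122 2005-05-25 17:20" page pgimeno proposes could be as small as a log kept by the script that serves the dump, and the same log can drive a staleness warning. The one-week threshold and all names are invented for illustration (the real script would have been PHP):

```python
# Sketch of download monitoring: the dump-serving script appends
# (ip, time) on every fetch; a status page then shows the last backup
# and warns when it is stale. The 7-day threshold is just an example.
import time

downloads = []  # (ip, unix_time) pairs; a real script would persist these


def record_download(ip, now=None):
    """Call this from the dump-download handler."""
    downloads.append((ip, time.time() if now is None else now))


def status_line(now=None, max_age=7 * 24 * 3600):
    """Text for a 'Last backup' box on the main page."""
    if not downloads:
        return "No backups recorded yet"
    now = time.time() if now is None else now
    ip, last = max(downloads, key=lambda d: d[1])
    if now - last > max_age:
        return "Backup out of date, please help to preserve this wiki"
    return "Last download: %s, %d hours ago" % (ip, (now - last) // 3600)
```

This still leaves the question raised next: the page only helps if something (or someone) actually looks at it.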
00:49:17 it can be in the same page as mirrors 00:50:03 "Last update: <,blah>" 00:50:19 Last backup 00:50:30 I don't know whether people would react when they saw "Last update/backup: <2002>" 00:50:31 then have a link on how to because a mirror site 00:50:39 become rather 00:50:54 Wait, didn't we decide that they wouldn't be mirrors proper, just backup sites? 00:51:13 I didn't realize that was decided 00:51:22 Okidoke 8-D 00:51:26 * GregorR rewinds. 00:51:44 the point is, you can do whatever you want with your own backup/mirror 00:52:01 for me a backup is sufficient 00:52:30 is it possible to download without downloading the entire database each time? 00:52:52 with rsync it seems so 00:53:08 yes 00:53:13 oh? cool.. didn't realize rsync could work with databases 00:53:20 the dump is a plain text file 00:53:25 it works on MySQL dumps as well :) (unless they are zipped) 00:53:32 nice 00:53:55 though it might get big if not zipped 00:54:02 probably not a problem for us 00:54:37 well, hopefully not, since rsync only sends the files that changed, right? or does it even do better than that and send a patch? 00:54:49 it sends a patch. 00:54:55 the dump is a single file 00:55:15 well, a forum with lots of daily traffic has a 50 Mb database 00:55:37 yes, but I don't think we'll get close to that 00:55:54 database dump, that is 00:56:16 I think we're in the ballpark of some weekly traffic 00:56:19 I think a 50 Mb dump is reasonable 00:56:43 is that before or after zipping it? (the forum example) 00:56:54 before 00:56:58 it's 6 megs zipped 00:58:37 I like the "Last backup" idea, but put it on the main page where everyone sees it 01:01:17 I wonder if it'd be possible to determine the last time anything was edited on the wiki.. then after a week it could say "Backup out of date, please help to preserve this wiki" or something like that :) 01:02:03 There could be a list of backups last week.
then people could be encouraged to take weekly backups, and to pick days when few others do 01:02:25 probably possible 01:03:04 I'm too sleepy to go on discussing 01:03:07 the "recent changes" page shows the last time something was edited 01:03:15 so good night all 01:03:18 it'd also be good to require some kind of contact information, like an e-mail address 01:03:24 pgimeno: night 01:04:16 although, that might raise privacy concerns 01:04:32 since the only way it would be useful is if it was also in the dump :) 01:04:47 yes, the dump contains every user's email address 01:05:18 I don't see that as a problem. WikiPedia does the same. if 01:05:19 -!- GregorR has quit (Remote closed the connection). 01:05:43 cool, then 01:06:49 -!- GregorR has joined. 01:07:05 Well, that was a pointless quit :P 01:07:07 the email should be optional anyway 01:08:35 BTW, if the database was sent compressed, rsync would be pointless, since the diffs would be irrelevant. 01:08:59 yes. that's why it shouldn't be compressed 01:09:17 Plus, rsync's traffic can be compressed, rather than the DB itself. 01:09:19 the question is how much bandwidth would it take to prepare the patch vs just sending the zipped file? 01:10:09 well, the whole point of rsync seems to be to conserve bandwidth, so I think it is worth it 01:10:16 ok 01:10:54 The very first download would be significantly higher-bandwidth. 01:10:59 After that it would be far far less. 01:11:06 i don't think a couple megabytes a week matters to anyone in the first place 01:11:54 agreed 01:11:55 I live in the happy world where the esowiki is about 1.6GB and there's more data there than anybody could swim through in a lifetime :P 01:12:12 don't count on it :) 01:12:32 that post pgimeno made is the first in how many months? :) 01:13:25 FYI, the zipped dump for current pages of the english wikipedia is 900MB, so 1.6GB is perhaps a bit optimistic... 01:13:48 calamari: too many 01:17:01 YAAAAAAAAAAAAAAAAAAAAAAAAAAAAAY! GIKI PROJECT APPROVED!!!
:) 01:17:45 * GregorR dances. 01:18:20 ha. you used that name :D 01:18:35 google says: Ghulam Ishaq Khan Institute of Engineering Sciences and Technology.. GIKI is a center of excellence in Pakistan for the natural sciences and Computing. 01:18:54 sounds geeky...... 01:25:58 kipple_: It was you who suggested the name, right? 01:26:12 yup 01:26:21 Well, it is a rawx0r name :) 01:26:30 Gregor's Wiki = Giki = Geeky = yeeeeh haw :P 01:28:08 I've had experience with bad project names, and this isn't one 01:28:24 just made a user on graue's wiki and got user ID 2 :) I take it not many have registered.... 01:28:25 OBLISK's original name was SupaRun ... it's embarrassing just to say that ... what a stupid name. 01:30:24 cool, I'm now ID 3 <=K 01:31:20 in a couple of years, maybe I can sell that user on ebay for lots of $$$$$ ;) 01:31:58 I've heard people have actually sold slashdot users with low IDs on ebay 01:44:51 I despise anybody stupid enough to actually buy those, and idolize anybody with the marketing genius to be able to sell them :P 01:48:15 Yay! I've finished zeroing the harddrive on my web server. and it only took 13 hours.... :D 01:48:47 There may be people who would suggest that that's a bit extreme ;) 01:49:08 it was suggested as a way to get rid of the bad sectors 01:49:57 at least I didn't get any r/w errors, like I got with various HD utilities 01:50:59 "bit extreme" it is though! more than a trillion bits is definitely extremely many :) 01:51:12 * kipple_ digs up the debian CD 02:40:09 -!- pgimeno has quit (Connection reset by peer). 02:51:02 -!- pgimeno has joined. 02:51:17 -!- wooby has quit. 03:04:06 pgimeno: up already? :) 03:06:50 bbl 03:06:51 -!- calamari has quit ("Leaving"). 03:10:05 -!- kipple_ has quit (Read error: 110 (Connection timed out)). 03:57:13 -!- wooby has joined. 03:59:10 -!- GregorR-L has joined.
03:59:24 hio 03:59:32 Hullo 03:59:38 * GregorR-L is just getting the Giki page up :) 03:59:59 neat 04:11:21 -!- graue has quit ("Are you a Schweinpenis? If so, type "I am not a Schweinpenis.""). 04:11:40 http://giki.sf.net/ 04:17:54 cool 04:18:06 been tinkering with moinmoin myself 04:18:12 wikis are so awesome :) 04:21:02 -!- malaprop has quit ("quit"). 05:39:09 So, what semi-common Wiki features should I adapt as Giki plugins next :P 06:43:36 -!- puzzlet has joined. 06:46:45 Hoi 06:46:49 hello 06:46:57 How goes? 06:47:33 i'm making a new esolang 06:47:39 first ever written in Hangul 06:48:01 http://puzzlet.org/puzzlet/%EC%95%84%ED%9D%AC~Ahui 06:48:30 Hmm 06:50:39 I will probably fail to write anything in this language :P 06:50:46 Seeing as that I don't even have the keyboard :P 06:51:34 Does the "Hello World" program print "Hello World," or "Hello World" in Korean? 06:52:17 "Hello, world!\n" 06:53:01 I assume that it can produce Hangul output? 06:53:10 yes. 06:53:34 it receives Unicode code point and prints 06:54:21 So where's the "??, ???!"program? (If Babelfish is smart ;) ) 06:55:25 http://puzzlet.org/puzzlet/%EC%95%84%ED%9D%AC~%EC%95%88%EB%85%95%ED%95%98%EC%84%B8%EC%9A%94 06:55:32 here is the similar one. 06:55:46 I can only assume "w00t" :) 06:55:47 since "Hello, world!" literally is an ackward expression. 06:55:59 awkward* 06:56:05 Hmm 07:02:09 -!- tokigun has joined. 07:03:26 Hullo 07:04:40 hello :) 07:04:49 How goes? 07:05:00 hmm 07:05:11 he wrote Ahui interpreter in Python 07:05:33 Ahh 07:06:38 puzzlet: i have to update interpreter for new spec 07:06:59 me either 07:11:45 pgimeno: hmm. i'm still holding the opinion that "provide wiki for esolangs" and "preserve esolangs" are two different tasks, and it just feels like this problem is being shoehorned into a solution that doesn't fit it. 07:12:16 ftp sites preserve content just fine. 07:12:50 what could be more KISS than that? 07:12:57 anyway, just my two cents. 
07:12:59 * cpressey out 07:13:16 Does MediaWiki allow HTML? 07:14:30 partially, afaik 07:15:01 i have seen
, , so far 07:15:01 Does MoinMoin support plugins? 07:16:37 see http://moinmoin.wikiwikiweb.de/CategoryMarket?action=fullsearch&value=linkto%3A%22CategoryMarket%22&context=180 07:33:54 wow 07:33:57 that was cpressey 07:34:02 the deity talked 07:34:16 puzzlet: ahui sounds incredibly dirty in russian 07:34:53 incredibly? 07:35:20 like? 07:37:01 lol 07:37:11 ALL HAIL CPRESSEY!!!!!!!!!!! 07:37:13 * GregorR-L bows down. 07:37:20 * puzzlet bows down. 07:40:26 Anybody want to add any more wikis to Giki's "other wiki software" list? 07:40:32 http://giki.sourceforge.net/index.php?title=other%20wiki%20software 07:41:00 MoniWiki! 07:41:08 OK, add it :P 07:41:13 http://moniwiki.sourceforge.net/wiki.php 07:42:34 does enabling html codes with like [[HTML()]] count? 07:42:53 Any means of injecting HTML into the wiki *shrugs* 07:43:16 So yes, long story short. 07:44:00 Cikiwiki, tokigun's ioccc entry - http://page.tokigun.net/obfuscation/cikiwiki.php 07:44:32 LOL 07:44:36 That's awesome XD 07:44:40 in judgement 07:44:57 judgement in progress 07:52:35 -!- puzlet has joined. 07:53:27 Hello puzzlet - z :P 07:53:42 hi 07:53:45 puzzlet must die 07:53:49 yeah 07:53:51 ... 07:54:11 wondering why i have been disconnected 07:59:59 -!- clog has quit (ended). 08:00:00 -!- clog has joined. 08:00:11 -!- puzzlet has quit (Read error: 60 (Operation timed out)). 08:02:29 -!- puzlet has changed nick to puzzlet. 08:06:47 So. 08:07:14 La-Ti-Do 08:07:16 :) 08:09:15 Boredom 8-D 08:09:59 musician's extinction, maybe 08:16:59 -!- puzzlet has quit (clarke.freenode.net irc.freenode.net). 08:16:59 -!- cpressey has quit (clarke.freenode.net irc.freenode.net). 08:17:00 -!- cmeme has quit (clarke.freenode.net irc.freenode.net). 08:17:00 -!- tokigun has quit (clarke.freenode.net irc.freenode.net). 08:17:00 -!- wooby has quit (clarke.freenode.net irc.freenode.net). 08:17:00 -!- GregorR-L has quit (clarke.freenode.net irc.freenode.net). 08:17:01 -!- lament has quit (clarke.freenode.net irc.freenode.net). 
08:17:01 -!- lindi- has quit (clarke.freenode.net irc.freenode.net). 08:17:15 -!- puzzlet has joined. 08:17:15 -!- tokigun has joined. 08:17:15 -!- GregorR-L has joined. 08:17:15 -!- wooby has joined. 08:17:15 -!- lindi- has joined. 08:17:15 -!- cpressey has joined. 08:17:15 -!- cmeme has joined. 08:17:15 -!- lament has joined. 08:19:16 YAY! 08:19:16 Netsplit! 08:26:04 -!- GregorR-L has quit ("Leaving"). 10:15:59 -!- comet_11 has quit (Read error: 110 (Connection timed out)). 10:28:54 -!- puzzlet has quit ("reboot"). 11:53:36 -!- tokigun has quit ("leaving"). 12:03:22 -!- puzzlet has joined. 12:30:10 -!- kipple has joined. 12:59:26 -!- CXI has joined. 13:02:26 cpressey: I agree that ftp hosting can perfectly cope with mere preservation; however the wiki is also a means to publish additional information about the language(s) which would otherwise require downloading files. In that sense, graue's idea about separating the wiki and the files deals with both preservation and additional information (in a too disconnected way for my taste, but it does) 13:33:40 -!- kipple_ has joined. 13:49:19 -!- malaprop has joined. 13:52:11 -!- kipple has quit (Read error: 110 (Connection timed out)). 14:33:54 yay. my website is finally up again! :D 14:34:00 (http://rune.krokodille.com/lang/) 14:43:11 cheers, kipple_! 14:47:33 it seems you finally managed to teach the HD which sectors to skip 14:47:47 who knows 14:47:54 could crash again 14:48:21 yes, if some rebel sectors appear 14:48:24 I left the last 30 gigs of the drive unpartitioned this time (that is where the problems were) 14:49:23 hm, it might be a problem of underventilation 14:49:47 could be. 
14:50:07 I've changed the IDE cable, which was suggested on seagates web site 14:50:22 it was an old one 14:51:12 I've had temperature problems with disks 14:51:42 maybe I should put a thermometer inside the box to check 14:52:48 (and with cpu's; check http://www.formauri.es/personal/pgimeno/temp/dsc02325.jpg ) 14:53:17 haha 14:53:51 the cpu is not the problem. It has never crashed that way (even though it doesn't have a CPU-fan) 14:57:51 have you noticed the placement of the disk in the shot? there were two disks together before, but it seems that the lack of ventilation caused the temperature to raise to a point where touching the disk was even prone to causing injury 14:58:12 (plus the fact that each disk raised the other's temperature) 14:58:34 I have only one disk 14:58:48 (the others made too much noise, so I removed them) 14:59:05 hum, my theory is not very acceptable for your case then 14:59:46 could still be too warm in the cabinet. the lack of a cpu fan could be a problem 15:01:14 I kind of doubt it but of course if you check it you'll be more confident 15:02:23 maybe. the disk is mounted in a bracket in a 5.25" bay, so it has room on all sides as well. 15:33:06 -!- puzzlet has quit (Remote closed the connection). 16:13:23 yesterday (more than 12 hours ago anyway) someone said that Martijn van der Heide's work grabbing permissions from authors for distribution on WoS was hardly a huge work... that hit my sensible fiber 16:13:48 http://www.worldofspectrum.org/permits/publishers.html 16:14:56 I think it's a huge work 17:08:54 so i was tinkering yesterday with moin moin, and got it working if anyone wants to check it out 17:09:42 http://wiki.esolangs.org/ 17:14:33 * pgimeno checks 17:22:01 nice wooby 17:22:48 thanks 17:23:05 i know there are other ones, and i don't want to further divide effort... 
so i may or may not keep it up 17:23:14 in any case moinmoin is nice 17:23:28 wait until the final decision is taken 17:24:10 yeah 17:24:10 hmm 17:24:18 what was that about the GFDL being restrictive? 17:25:19 CXI: GFDL can have the 'invariant sections' 17:25:22 CXI: it has hassles about immutable portions. 17:25:57 oh, sure, but the GFDL as used by wikipedia specifies no invariant sections 17:26:57 It also has anti-DRM requirements. 17:27:46 http://people.debian.org/~srivasta/Position_Statement.html 17:28:15 mm, I remember that article 17:28:56 the only reason I mention it is that wikipedia compatibility would be nice 17:29:03 dual-license, maybe? 17:29:32 actually, hmm, that would only work in one direction 17:30:09 gragh, licensing is a pain :( 17:32:56 It's a pity Wikipedia has such a bad license, ya. 17:33:11 *goes to bed instead* 17:36:39 malaprop: they have "or later" clause there 17:55:04 wooby: are you around? 18:18:15 * pgimeno drives home 18:49:50 hey. has anybody seen the BF mandelbrot program by Eric Bosman? It's really cool! 18:50:23 http://www.microlyrix.com/software/bfdev/output.jpg 18:55:45 sweet 18:56:10 runs awfully slow in my java interpreter though... should have compiled it.... 19:01:29 back 19:01:39 very nice :) 19:02:02 kipple_: could you try if my optimizing interpreter makes it run faster? 19:03:46 how could it not.... my interpreter doesn't optimize 19:04:43 I don't have GCC on my win box. Will it compile with Visual Studio? 19:05:02 don't know, maybe it does with a bit of makefile tweaking 19:05:40 I was just curious anyway 19:06:16 malaprop: I've just seen your message in lang 19:06:39 ha. I can run it on my linux box. pedro's optimizing compiler on a 187 MHz box vs. kipple's lousy java interpreter on a 1.4GHz box 19:07:29 :) 19:07:40 actually it just tokenizes (no compiling) 19:08:03 ah yes 19:08:05 my bad 19:08:56 anyway... I'm also curious about the BF compiler written in BF :) 19:09:10 where's the BF code doing the output?
19:09:25 the Mandelbrot output I mean 19:10:02 find a dot 19:10:13 :P 19:10:48 the java interpreter runs it about 3 times faster than brfd 19:11:05 considering the difference in hardware that's quite good for brfd 19:12:18 hm, not bad but not as good as I expected 19:12:42 that's like a 3X speedup 19:12:51 hard to say 19:13:03 there's more than MHz that counts 19:13:09 lament: I didn't explain myself, sorry 19:13:25 I mean where to find the program 19:13:31 ah 19:13:33 in a sec 19:14:32 lament: did you write Smallfuck? 19:14:32 http://brainfuck.kicks-ass.net/files/mandelbrot.bf 19:14:43 thanks 19:14:48 pgimeno: yes 19:15:06 it's an awesome idea :) 19:15:32 I'm interested in whether Smetana can be done Turing-complete 19:15:38 er 19:15:43 that's exactly what i made smallfuck for 19:15:46 to show that 19:15:51 yeah 19:16:16 I've read about that and apparently the conclusion was that it wasn't 19:16:22 no 19:16:33 oh? 19:16:39 smetana programs can only have limited "memory" since the size of memory is limited by the size of the code 19:16:55 yeah 19:16:59 but within that constraint a smetana program can emulate a BF machine of arbitrary size 19:17:02 as I showed 19:17:33 i.e. it's as "turing-complete" as any physical computer :) 19:17:54 can I read the whole story somewhere? what I read in the voxelperfect wiki is not accurate it seems 19:18:12 what does it say there? 19:18:33 er, want to check yourself? 19:18:38 yeah 19:18:40 link? 
19:18:43 basically that some programs don't stop or something 19:18:52 and that it's shown to not be Turing-complete 19:19:36 http://esoteric.voxelperfect.net/wiki/Smallfuck 19:20:42 hmmmm 19:21:08 * lament tries that 19:21:19 that just looks like an error in my compiler :) 19:22:19 oh 19:22:53 pgimeno: I've started a test running both interpreters on the same machine :) 19:23:06 heh, cool 19:23:31 * lament tries to figure out how to operate the smallfuck compiler 19:23:35 I'm looking in the backlog for that link with an optimizing compiler for Linux 19:23:40 man, when i wrote this i was still in high school :) 19:23:49 oh hehe 19:24:07 you mean this one: http://www.nada.kth.se/~matslina/awib/ 19:24:30 yeah! thanks 19:24:40 * pgimeno bookmarks the link this time 19:27:29 ummmmmmmmmmmmmmmmmmmmmmm 19:27:32 that program 19:27:33 a nice aspect of that compiler is its ability to compile self 19:27:37 *[>*] 19:27:46 it works perfectly in my compiler 19:27:53 terminates once it reaches the end of memory 19:28:03 so the info on wiki is simply wrong 19:28:20 oh, do you mean an smetana version? 19:28:26 yes 19:28:47 * pgimeno considers taking out smetana from the non-Turing-complete category 19:28:51 it sets all elements of memory to * and terminates 19:29:13 well, it's still not turing-complete :) 19:29:22 there should be a word for this particular type of ability 19:29:30 turing-complete with a memory constraint 19:29:34 maybe there even is a word 19:29:40 turing-bounded? 19:29:43 it seems a very common situation 19:29:49 -!- calamari has joined. 19:29:50 yeah 19:29:52 hi calamari 19:29:55 hi 19:30:13 non-infinite turing-complete 19:30:18 finitely turing-complete 19:31:15 yeah 19:31:17 something like that 19:31:26 i'll edit the wiki 19:31:33 yeah, actually bound is not a proper term 19:32:26 And binding already has a definition in languages, so... potential confusion. 
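lament's claim about `*[>*]` is easy to check with a toy interpreter. This is a sketch of Smallfuck as described in the discussion (a fixed-size bit tape, `*` flips the current bit, and running the pointer off either end halts the program), not lament's original implementation:

```python
# Minimal Smallfuck interpreter: commands > < * [ ] on a fixed-size
# bit tape; the program halts when the pointer leaves the tape.
def smallfuck(program, tape_size=8):
    tape = [0] * tape_size
    stack, match = [], {}          # pre-match the brackets
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            match[i], match[j] = j, i
    ptr, pc = 0, 0
    while pc < len(program):
        c = program[pc]
        if c == ">":
            ptr += 1
        elif c == "<":
            ptr -= 1
        elif c == "*":
            tape[ptr] ^= 1         # flip the current bit
        elif c == "[" and tape[ptr] == 0:
            pc = match[pc]
        elif c == "]" and tape[ptr] != 0:
            pc = match[pc]
        if not 0 <= ptr < tape_size:
            break                  # ran off the tape: halt
        pc += 1
    return tape
```

`smallfuck("*[>*]")` sets every cell and terminates at the right edge of the tape, matching lament's description ("it sets all elements of memory to * and terminates") rather than the wiki's "doesn't stop" claim.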
19:32:59 yup, wrong choice on my side 19:33:50 if you want i can give you the smetana/smallfuck files 19:34:22 lament: Which of the wikis are you editing? 19:34:58 argh, that requires immediate attention 19:35:48 I was writing another message to the list on the evolution of the proposals but was having a break 19:36:11 malaprop: voxelperfect 19:37:56 may I paste some timings? 19:38:49 well, it's 4 lines total, I don't think it will be annoying 19:38:53 $ time ../brfd-1.0/brfd awib-1.0rc4.b < awib-1.0rc4.b > awib1 19:38:53 real 0m27.503s 19:39:00 $ time ./awib < awib-1.0rc4.b > awib2 19:39:00 real 0m2.001s 19:39:28 :) 19:40:29 it's computing the mandelbrot now 19:41:09 here too... 19:42:32 6.77 secs... but it seems it doesn't like my terminal 19:42:45 what language is that ".b"? befunge? 19:42:45 you need about 130 cols 19:42:52 brainfuck 19:42:55 ok 19:43:01 usually 19:43:14 maybe i should write fast brainfuck interpreter then 19:43:43 argh, I'm really dumb... the output was the compiled code rather than the executed one 19:43:55 pgimeno: mandelbrot in 6.77 secs?? impressive 19:44:08 that was compiling time O:) 19:44:14 ahh 19:44:49 11 secs will surely make more sense :) 19:45:35 and yes, the output is a beautiful Mandelbrot set 19:45:37 what? 19:45:46 the mandelbrot or something else? 19:46:03 the mandelbrot 19:46:10 real 0m10.381s 19:46:10 user 0m9.557s 19:46:10 sys 0m0.028s 19:46:18 impressive 19:46:37 I don't htink brfd will do i in 11 minutes here :) 19:46:59 spelling schmelling 19:47:00 I'm trying it right now 19:47:29 well it 19:47:47 's obvoius that it isn't cheating, at least :) 19:48:04 yup, that's crystal clear 19:48:14 you can see it noticably slowing down when it gets to the edges 19:48:49 edges? 
19:48:50 yeah, the set itself is the slowest 19:48:58 edges of the set 19:49:00 ah 19:49:11 as the code is not exactly readable, it could have been nothing more than an advanced Hello WOrld 19:49:13 of course the set itself has to be the slowest 19:49:50 actually such thing as a bf compiler in bf is dangerous 19:50:10 it turns out to be honest but it could have generated a virus 19:50:12 not if it compiles to bf :) 19:50:21 a trojan, rather 19:51:10 real 4m16.412s 19:51:10 user 4m8.624s 19:51:38 and the comiled was 11 secs? wow 19:51:50 yup :) 19:52:18 brfd is still running here. I estimate it will take about 25 mins... 19:52:43 ouch :) 19:53:05 wonder how long the unoptimizing java interpreter will take.... 19:53:55 well er... longer 19:54:05 you think? 19:54:20 perhaps I should compile it to binary to get a more fair comparison 19:54:23 what I'm wondering is how long would it take for a non-optimizing compiler 19:57:33 calamari: have you tried how hard is a MediaWiki to set up? 19:57:58 MediaWiki is not hard to set up. 20:06:25 sorry, phone 20:06:44 have you tried it, malaprop 20:06:46 ? 20:13:16 MediaWiki? Yes, I run a couple. None public ATM, tho. 20:13:35 oh ok 20:16:29 pgimeno: nope.. no time. We will just be taking dumps, anyways, though.. right? :) 20:16:46 it seems so 20:16:59 how easy is to rebuild the database? 20:17:05 er 20:17:13 that question is for you, malaprop 20:17:15 mysql -u user -p -h host db_name < dump.sql 20:17:38 I like MoinMoin better, so I'll be sticking with it (won't be using it for esowiki though) 20:18:06 I'm happy to download the mediawiki dumps tho 20:18:08 given an empty database that's ok, but what if all you want is to update the database with the last changes? 20:18:27 I need to get going.. cya all 20:18:30 later 20:18:32 -!- calamari has quit ("Leaving"). 20:20:16 A partial update is probably possible, but a drop & reload would be simpler and saner. 
We're unlikely to ever have a db that is so large it takes more than a few minutes to do this. 20:20:33 okay 20:22:16 For an extreme example, the English Wikipedia (http://download.wikimedia.org/) is currently ~36G and can usually run in under 12h. So it's really really unlikely we'll have any kind of painful downtime. 20:22:42 so do you think that this scheme is feasible? 20:23:28 Entirely, yes. One of the advantages of MediaWiki is that it's immensely popular and featureful -- so we'll get nearly any feature we want without having to do it ourselves, and there's no worries that the maintainers will disappear. 20:24:05 I hope that graue cares a bit about the look tho 20:24:35 well, I want a feature to include java applets in the wiki. I think we might have to do that one ourselves... 20:24:46 am I the only one who finds the left bar disturbing? 20:26:08 kipple_: I don't think that the wiki is the best place to host java apps; a regular server seems to make more sense 20:26:29 anyway it's probably already done 20:26:46 left bar disturbing? 20:27:07 calamari's moinmoin server does not have it and I find it cleaner 20:27:15 I think it could do with some changing of its elements, but otherwise it's nice 20:27:29 The language List should be in it for instance 20:27:49 indeed 20:27:50 -!- GregorR-L has joined. 20:27:59 hi GregorR-L 20:30:32 pgimeno: brfd is done: 20:30:33 real 48m20.233s 20:30:34 user 48m9.140s 20:30:34 sys 0m1.210s 20:30:38 LOL 20:31:25 heh 20:31:54 hmm. does that mean that my unoptimizing interpreter will take hours? 20:32:20 you can try to make an estimation 20:32:43 nah. I'll just run it. 20:32:53 probably yes 20:34:08 so I think that most votes here favor MediaWiki, right?
20:34:20 (sigh) 20:34:29 looks that way 20:34:36 I'm still a bit unsure 20:35:03 wooby went the moinmoin way also with success 20:35:46 but yes, most people agree on mediawiki 20:35:58 okay, I'll follow the trend in the message 20:36:45 *whew* 20:37:28 were you reading the log or something, Gregor? 20:37:38 Yeah. 20:37:41 My brain melted. 20:37:43 hehe 20:44:35 malaprop: so are you offering to make backups? 20:48:03 er, you offered that in the list; do you want to set up a mirror? 21:06:59 what's the status of file uploading? 21:07:13 in voxelperfect, I mean 21:07:36 does anyone know? 21:25:45 No clue 21:26:54 pgimeno: I'm happy to do hosting, mirror, or backup as needed. So if we have a host and want mirrors, I'll be a mirror. 21:27:29 me too 21:27:47 i also have this domain, esolangs.org... which i'd point at where we eventually decide the main site should go 21:28:31 Dern, this fopen-ing of a web site in PHP is not working right for me >_< 21:28:55 GregorR-L: Does the local install permit fopen_wrapper? 21:29:18 grep fopen /path/to/php.ini 21:29:39 malaprop: the problem is that so far there's no mirror 21:29:39 erm, is allow_url_fopen, not fopen_wrapper, pardon 21:30:02 wooby: would you try MediaWiki? 21:30:18 pgimeno: sure i'll have some time to set it up later tonight 21:30:27 pgimeno: Of voxelperfect? Shall I contact graue and work it out with him? 21:31:13 malaprop: Yeah, I use it elsewhere, I just don't know what's causing it to fail in this one situation. 21:31:44 Ah. Were you venting or looking for help? :) 21:31:46 malaprop: I guess so 21:31:55 pgimeno: ok 21:32:02 well 21:32:07 * pgimeno is a bit confused 21:32:34 at the moment I need to complete the message reporting the current status 21:33:57 No, just to complain 8-D 21:34:18 so you don't still need to do that 21:37:56 pgimeno: about the left menu in MediaWiki. You can disable it if you're logged in 21:39:56 oh, I see 21:40:22 I would also suggest changing the default skin. 
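GregorR-L's fopen trouble above comes down to the setting the channel eventually names correctly: `allow_url_fopen` in php.ini, which controls whether PHP's `fopen()` accepts URLs. A hedged sketch of checking it; this assumes a `php` CLI is available and degrades gracefully when none is:

```shell
# Report whether PHP permits fopen() on URLs (the allow_url_fopen
# ini setting; "fopen_wrapper" was the misremembered name).
if command -v php >/dev/null 2>&1; then
  STATUS=$(php -r 'echo ini_get("allow_url_fopen") ? "On" : "Off";')
else
  # No CLI available: fall back to the grep suggested in the channel
  # (the php.ini path is a placeholder assumption).
  STATUS="unknown - try: grep allow_url_fopen /path/to/php.ini"
fi
echo "allow_url_fopen: $STATUS"
```

On a shared host the setting can also differ between the CLI and the web server's PHP, so the most reliable check is a small `ini_get()` call in a page served by the host itself.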
21:40:25 It is also possible to add skins with very different appearances. 21:40:42 there are 5 so far 21:40:53 I think the one WikiPedia uses looks best of the 5 21:41:00 Monobook or something 21:41:08 kipple_: Yes, that's Monobook. 21:42:04 looks very much like WikiPedia then, but at least then it's very familiar to most 21:43:00 "This [Quickbar settings] preference only works in the 'Standard' and the 'CologneBlue' skin." 21:43:06 :( 21:43:13 anyway I like CologneBlue 21:43:30 well, then what's the problem? 21:43:47 no problem really 21:48:21 if it only had a bit of a margin... :) 21:48:37 pgimeno: Yeesh, install GreaseMonkey already. :P 21:55:28 nostalgia seems to be Good enough(tm) 22:01:10 Grr. I'm trying to make an InterWiki Content system, but I can't seem to fopen URLs, even though I know SF lets you >_< 22:01:48 IIRC that's a php setting 22:02:02 Repeat: "even though I know SF lets you" 22:02:11 oops, ok 22:02:14 :) 22:31:15 anyone aware of a language based on the idea of data redirection, like piping? 22:31:29 i recall seeing a BF derivative that did something similar 23:07:05 -!- GregorR-L has quit (Read error: 113 (No route to host)). 23:14:57 -!- wooby has quit. 23:36:45 pgimeno: I finished running mandelbrot.b in the java interpreter: 23:36:45 real 179m45.762s 23:36:45 user 179m31.570s 23:36:45 sys 0m1.880s 23:37:07 which means your interpreter is about 3.7 times faster 23:47:30 whoa 23:48:15 I've tried my interpreter with the -s option and the speed difference is not significant 23:48:31 what does the -s option do? 23:48:43 disable optimization (the s stands for slow) 23:49:08 what kind of optimization do you do? 23:49:31 I don't remember that very well :) 23:49:59 I think that it optimizes copy operations and things like that 23:50:24 addition to multiple cells, zero out cells... 23:51:28 -!- GregorR-L has joined. 23:51:37 * pgimeno looks 23:51:46 Any uber-javascript-hax0rs here?
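pgimeno's half-remembered list of optimizations above (zero out cells, add a cell into its neighbours, fold runs of +/-) is a standard brainfuck peephole pass. A small illustrative sketch; the function and op names are invented here, not taken from the actual interpreter:

```javascript
// Peephole-optimize a brainfuck program into higher-level ops:
// [-] becomes "zero", [->+<] becomes "move", and +/- runs are folded.
function optimize(code) {
  const ops = [];
  let i = 0;
  while (i < code.length) {
    if (code.startsWith("[-]", i)) {           // set current cell to 0
      ops.push(["zero"]);
      i += 3;
    } else if (code.startsWith("[->+<]", i)) { // add cell into cell+1, zero it
      ops.push(["move", 1]);
      i += 6;
    } else if (code[i] === "+" || code[i] === "-") {
      let n = 0;                               // run-length encode +/- chains
      while (code[i] === "+" || code[i] === "-") {
        n += code[i] === "+" ? 1 : -1;
        i += 1;
      }
      ops.push(["add", n]);
    } else if ("<>[].,".includes(code[i])) {
      ops.push([code[i]]);                     // keep the remaining BF commands
      i += 1;
    } else {
      i += 1;                                  // anything else is a comment
    }
  }
  return ops;
}
```

Folding like this is why an optimizing interpreter can run a clearing loop such as `[-]` in constant time instead of once per decrement.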
23:52:07 kipple_: it's embarrassing not being able to answer your question :) 23:52:18 Slash anybody else who knows if it's possible to open a web page to get its content in Javascript? 23:53:21 GregorR-L: what do you mean by "get its content in JavaScript"? do you mean get the JS code which it has embedded? 23:53:57 No, I mean something like read the HTML from http://www.google.com/ 23:54:24 sure, wget http://www.google.com/ ; less index.html 23:54:38 In JavaScript ... 23:54:49 you mean from another web page? 23:54:52 Yeah. 23:54:58 hmm 23:54:58 I'm not good at explaining this obviously :P 23:55:18 oh, do you mean the JS code to *connect* to another page to grab the text? 23:55:25 Basically. 23:55:42 Other than, say, opening it up in an iframe and loading that (which may or may not work) 23:56:44 to me it sounds like it would be a security risk, say the page is in file://... 23:57:02 Hmm, 'tis a good point.
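The security risk pgimeno points at is the browser same-origin policy: script on one page may only read content from its own origin (scheme, host, port), which is exactly why reading http://www.google.com/ from unrelated JavaScript fails, and why a file:// page doing so would be worse. A sketch of the comparison involved; `origin` and `sameOrigin` are hypothetical helpers for illustration, not browser internals, and the real rules have more corner cases:

```javascript
// Sketch of the origin comparison behind the same-origin policy.
// An origin is scheme + host + port; file: URLs get no origin here.
function origin(url) {
  const m = url.match(/^([a-z]+):\/\/([^\/:]+)(?::(\d+))?/);
  if (!m) return null;
  const defaultPort = { http: "80", https: "443" }[m[1]] || "";
  return m[1] + "://" + m[2] + ":" + (m[3] || defaultPort);
}

function sameOrigin(a, b) {
  const oa = origin(a);
  return oa !== null && oa === origin(b);
}

// A page on a.example may read a.example content, but not b.example's:
console.log(sameOrigin("http://a.example/page", "http://a.example/data")); // true
console.log(sameOrigin("http://a.example/page", "http://b.example/data")); // false
```

This is also why the iframe trick mentioned above "may or may not work": the frame loads fine, but script from a different origin is not allowed to read its document.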