00:25:41 -!- RedDak has quit (Remote closed the connection).
00:26:47 -!- poiuy_qwert has joined.
01:14:50 -!- ehird` has quit.
02:01:49 -!- edwardk has joined.
02:07:59 -!- calamari has joined.
02:32:19 anyone awake who is versed in haskell syntax? playing with a toy language with similar syntax at the moment
02:32:32 oerjan? Oh oerjan?
02:32:34 *ahem*
02:32:39 but with no types
02:32:41 heya oerjan
02:33:23 basically i figured out a way to code up monads without types, so i've been playing with a framework somewhat between haskell and erlang for the last few days
02:34:06 ever used erlang?
02:34:37 nope
02:34:41 the idea in erlang is that you have atoms lists and tuples, and a few primitive types, but nothing else really and no type system.
02:35:30 what i went back and did was revisit the notion of a constructor in a typeless setting, so now i just say that a given constructor has some 'arity' associated with it. i.e. cons has arity 2 and nil has arity 0, but avoid specifying the types of the terms that go in each slot.
02:36:01 ok. similar to prolog i guess.
02:36:16 so in erlang you'd have to say {cons, x, xs} or {nil} whereas in this setting you can say arity 2 Cons, then Cons x xs and the default arity is 0
02:36:29 sorta, except it can be written applicatively because of the basic lazy semantics
02:36:30 (i vaguely recall erlang being descended somewhat from prolog, btw)
02:36:53 so you get haskell style pointfree syntax
02:37:27 yeah
02:37:31 so the dilemma comes about with monads, right since traditionally you think of a monad as doing different things based on the type it is inhabiting
02:38:15 pretty essential if you want more than one with the same syntax
02:38:15 so if you don't have types, haskell style monads seem like they are right out without passing around some sort of additional parameter that indicates the 'type of monad you are in' and using that. i.e. the haskell dictionary passing style
02:38:26 well, turns out you don't need that after all =)
02:38:46 the trick is the monad laws
02:38:59 i have thought similar thoughts
02:39:16 so for a concrete example lets define the identity monad
02:39:17 return = Ok
02:39:21 Ok x >>= f = f x
02:39:39 now Ok is a constructor which can be used as a tag when pattern matching on >>= so it knows what case its in.
02:39:45 mzero = Nil
02:39:50 Nil >>= f = f x
02:39:51 er
02:39:55 Nil >>= f = Nil
02:40:07 ok, so now we have the Identity Monad, and a Maybe monad
02:40:29 but what we need to do is anticipate that anywhere we could use the resulting monad we could also get 'Ok'
02:40:41 runReader (Reader f) = f
02:40:46 runReader (Ok x) = const x
02:41:19 so, then we consider something like 'ask' which seems to behave differently depending on context.
02:41:23 ask = Reader id
02:41:24 yep, you need a return that can be used everywhere. same solution as i thought of.
02:41:49 so the trick there is to autolift that into the reader transformer
02:41:49 there is some limitation though.
02:42:17 there are, in that you always have to tag it with a constructor
02:42:18 you can only use this for monads where >>= is strict in the first argument.
02:42:23 there is no 'true identity monad'
02:42:27 that is also true
02:42:35 Reader is not really
02:42:39 at least strict insofar as the outermost tag
02:43:28 otoh if you use something like runReader, maybe you can avoid that.
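
A minimal Haskell sketch of the tag-dispatch idea discussed above, approximating the untyped setting with one universal datatype. The type V, the names ret and (>>>=), and the closed set of constructors are assumptions made for the illustration; only the Ok/Nil/Reader equations follow the ones given in the log.

    -- One universal value type stands in for "no types"; only the
    -- constructors actually used in the conversation are included.
    data V = Ok V | Nil | Reader (V -> V)

    ret :: V -> V
    ret = Ok                          -- "return = Ok"

    (>>>=) :: V -> (V -> V) -> V      -- a single global bind, dispatching on the tag
    Ok x     >>>= f = f x                                    -- Identity/Maybe success
    Nil      >>>= _ = Nil                                    -- Maybe failure
    Reader g >>>= f = Reader (\r -> runReader (f (g r)) r)   -- Reader

    runReader :: V -> (V -> V)
    runReader (Reader f) = f
    runReader (Ok x)     = const x    -- autolift a plain Ok into Reader

    ask :: V
    ask = Reader id

    -- e.g. runReader (ret Nil >>>= \_ -> ask) e reduces to e: the expression
    -- stays tagged Ok until ask lifts it into Reader, and runReader peels that off.

In the toy language the constructor set and the >>= clauses are open and extensible, so a closed Haskell data declaration is only the nearest self-contained approximation; it also makes the limitation raised above concrete, since this >>= must inspect the outermost tag of its first argument to know which clause applies.
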
02:43:35 yeah
02:43:38 so the other fix
02:44:01 was to do something like the solution i want to play with for how to handle the monomorphism restriction and numerical classes
02:44:11 which reverts to carrying around the type parameter
02:44:41 if we define : to be an infix function, and + to be an arity 2 constructor you can say things like
02:44:59 a + b : Int = mp_add (a : Int) (b : Int)
02:45:21 haskell uses ::
02:45:22 and so on and so forth as a way to define how to recursively evaluate an expression as an integer
02:45:39 yeah the : in that case is an actual operator in my toy language. and i never liked haskell's :: =)
02:45:49 ML uses :
02:45:51 yeah
02:45:56 and the opposite for lists
02:46:09 i'm : for 'types' in this case and ; for lists
02:46:45 so with that same machinery you can make 'Return' and >>= both into constructors and define a similar decomposition
02:47:10 Return x : List = (x; Nil)
02:47:49 m >>= k : List = concat (map k m)
02:48:23 then the same expression consisting of Returns and >>='s can be reused in different monads until you apply a : to 'type' it
02:48:46 there are some tricks that still entails because : T should be idempotent, etc.
02:48:58 didn't you play around with a language with lots of types and (partially undecidable) inference a while ago?
02:49:03 and basically all : T is doing it taking the place of runT
02:49:34 yeah this is the same language i just stripped out the types for a while, since the non-type aspects (the substructural annotations and the theorem proving parts) are the parts I care about
02:49:55 so i wanted to see how far i could go with no types in the traditional sense, just detected pattern match failure
02:49:56 oh
02:50:08 basically its if i use ndm's CATCH tool as my type system ;)
02:50:25 or ESC/Haskell without the /Haskell =)
02:50:59 i figured it was a much stronger result to be able to duplicate existing techniques with the machinery than to just layer another abstraction on top
02:51:17 ic
02:51:57 otherwise the language remains unchanged, optimistic evaluation with lazy semantics, still has the theorem proving bits, but none of those pesky types
02:52:17 having the monads above lets me stay lazy and not go crazy
02:52:52 constructor elimination becomes of course a critical optimization step because EVERYTHING is tagged and constructors are global
02:53:11 so my apply/dispatch mechanism is uglier
02:53:44 and as a result of the above reasoning its necessary for me to allow extension of the same function (say >>=) in many locations in the code
02:53:57 so its hard to do separate compilation
02:54:01 but its a fun mental puzzle
02:54:06 "We're all mad here. I'm mad, you're mad." "How do you know i'm mad?" Said Alice. "You must be." Said the Cat, "or you wouldn't have come here."
02:54:26 (regarding going crazy ;) )
02:54:51 =)
02:56:11 http://comonad.com/r5.opt
02:56:19 is sort of a spew of random gibberings in this setting
02:57:23 arity declarations are kind of like haskell fixity declarations, but for right now fixities are er. fixed coz i'm lazy and haven't added them =)
02:57:47 -!- calamari has quit ("Leaving").
02:57:49 \x. foo ~= \x -> foo since there is no legal use for a . inside a pattern there
02:57:58 and its closer to the lambda calculus formalism
02:58:58 i like the fact that it sort of naturally subtypes back and forth between the monads (Nil and Nothing collapse, etc)
02:59:12 the Identity monad is just a Maybe monad where Nil is never used, etc.
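
A sketch of the ":" reduction idea above, under the assumption that terms are built from uninterpreted constructors and a separate function gives them meaning at a chosen "type". Since : is taken in Haskell, the annotation is modelled as a function `at`; the names Term, Ty, TInt and TList are illustrative, while the rules mirror the equations quoted in the log (e.g. Return x : List = (x; Nil)).

    -- "+", "Return" and ">>=" are just constructors; `at` (standing in for the
    -- infix ":") is the reduction function that evaluates a term at a "type".
    data Term = I Integer                -- integer literal
              | Plus Term Term           -- "+" as an arity-2 constructor
              | Return Term              -- "Return" as a constructor
              | Bind Term (Term -> Term) -- ">>=" as a constructor
              | L [Term]                 -- an already evaluated list

    data Ty = TInt | TList

    at :: Term -> Ty -> Term
    at (I n)      TInt  = I n            -- ": Int" is idempotent on results
    at (Plus a b) TInt  = case (at a TInt, at b TInt) of
                            (I x, I y) -> I (x + y)           -- mp_add stand-in
                            _          -> error "not an Int"
    at (Return x) TList = L [x]                               -- Return x : List = (x; Nil)
    at (L xs)     TList = L xs           -- ": List" is idempotent too
    at (Bind m k) TList = L (concatMap step (asList (at m TList)))
      where step t        = asList (at (k t) TList)
            asList (L ys) = ys
            asList _      = error "not a List"
    at _          _     = error "pattern match failure"

For example, at (Plus (I 2) (Plus (I 3) (I 4))) TInt reduces to I 9, and the same Return/Bind expression can be reused at another tag by adding more `at` clauses, which is the "reused in different monads until you apply a :" point above.
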
02:59:36 i had an earlier version with "types" for the monads before i realized i could live without the type tags
03:00:40 http://comonad.com/r3.opt
03:01:45 basically anything without a = in the line is taken to be an 'axiom' which is just a function that any time the pattern matches returns true, and if none of the patterns match anywhere it returns false
03:02:03 and i do prolog/erlang style structural equality tests if you use the same term in a pattern twice
03:02:06 x <: x
03:02:09 for instance
03:03:45 hm... you get sort of a supermonad that includes all the others...
03:03:51 but plumbing around the monad type everywhere got ugly, i did a version with a sort of implicit type-level reader monad and another draft using dynamic binding and delimited control to change a dynamically scoped 'current monad' variable with pattern matching support on dynamic variables
03:03:53 yeah
03:04:16 brings to mind that somewhat horrid 'unimo' paper iirc
03:04:52 but basically i find it amusing that it lets me express a subtyping relationship among the different monads thats more or less impossible to express in a hindley milner style formalism
03:06:02 i started down this slippery slope with http://comonad.com/reader/2007/parameterized-monads-in-haskell/
03:06:35 one thing i like about it is the reductions are confluent even if you are locally working in a different 'sub-monad'
03:07:09 i.e. if you work with just returns and >>= you stay in 'identity' until you mix in an 'ask' then you get lifted out into Reader, then maybe you 'get' so you get placed in a state monad, etc.
03:07:23 then when you runFoo you peel off the layers of the onion
03:07:31 and you coerce the types to fit
03:08:02 so locally you can avoid having to pay the overhead of the full monad you are in
03:09:19 which when you add in constructor-elimination should open up a lot of optimization opportunities that you never get exposed to in haskell because you can't see them through the monad-noise
03:10:09 the pain in the ass comes from the fact that i have to try to use control flow analysis to see if you ever do the wrong thing ;)
03:11:39 still playing with it
03:11:48 but i thought it was a neat way to tackle the 'untyped monad'
03:12:09 as opposed to the dictionary passing style mechanism that dpiponi used in his toy untyped monadic language
03:12:40 coz passing dictionaries everywhere explicitly gets rid of the nice syntactic benefits of having monads
03:21:37 i guess a mathematician if necessary would put the monad as index on the >>=
03:22:26 i played with carrying around an explicit type using the : as a reduction operator, which lets you do things like the anonymous reader monad as well
03:22:36 not sure if thats a win though
03:23:00 the : reduction technique is how i'm thinking about handling the monomorphism restriction though
03:23:05 its a funny way to view the world though
03:23:09 + as a constructor
03:23:26 then applying an evaluation function (:) to the constructed value the evaluate it
03:24:36 your mixing of monads might be a bit awkward if the transformers don't commute
03:24:48 yeah a known issue =/
03:25:19 you may have to inject a runFoo or a : t to disambiguate
03:27:06 x : READER = Reader (runReader x)
03:27:33 so that its idempotent
03:28:04 then i can stick to my 'typeless monad' approach and use (:) as a disambiguation operator letting it retain its 'type-annotation' like connotation
03:51:22 f(15) = 16; f(34) = 92; f(13) = 8; f(20) = ???
03:52:30 huh?
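
A small sketch of the "axiom" and repeated-pattern-variable remarks above. Haskell does not allow non-linear patterns, so the x <: x axiom is modelled here with an explicit structural equality test; the names T and subtypeOf are illustrative, not from the log.

    -- An "axiom" is a predicate that is True whenever some clause's pattern
    -- matches and False otherwise; the repeated variable in "x <: x" becomes
    -- a prolog/erlang-style structural equality check.
    data T = Atom String | App T [T]
      deriving (Eq, Show)

    subtypeOf :: T -> T -> Bool       -- the "<:" axiom from the log
    subtypeOf a b | a == b = True     -- x <: x
    subtypeOf _ _          = False    -- no clause matched: not derivable
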
03:52:57 is this all the information about f you've got?
03:54:16 13 is a prime, and 8 is the previous fibonacci number
03:54:27 that's all the information i have
03:54:41 ah one sec
03:54:48 just looked at my screen i think i know it
03:54:56 oh, edwardk
03:55:14 you were the one with the language that no one but oerjan understood, right?
03:55:51 > let f x = x*4 - 44 in map f [15,34,13,20] ===> (10:55:49 PM) Lambda Bot: [16,92,8,36]
03:56:01 yeah
03:56:38 its changed into a simpler language with basically the same issue ;)
03:56:58 :D
03:57:32 bsmntbombdood: anyways that should answer your question
03:58:03 the key insight for me was the f(15) and f(13), i backed that out to guess f(11) and used f(34) to test its linearity
03:58:31 aaah
04:00:35 exercise: now, if it hadn't been linear, why wouldn't it be a good idea to try a second degree polynomial? ;)
04:00:51 heh
04:01:11 coz you could always find one? =)
04:01:21 edwardk wins the prize!
04:01:32 i knew that math degree was good for something ;)
04:02:11 actually thats a perfectly good use for a second degree polynomial for these silly underspecified problems =P
04:03:13 probably
04:03:15 what about f(361) = 22; f(121) = 14; f(81) = 12; f(25) = ??
04:03:31 * edwardk watches as #esoteric does bsmntbombdood's homework =)
04:03:42 not my homework!
04:03:50 (my dads) :P
04:04:02 look at f(121) and f(81), do the same trick i did above
04:04:05 in this case we are looking at squares
04:04:15 oerjan: ?
04:04:45 all the arguments are squares
04:04:46 'every 20 it changes by 1', so find 0
04:05:07 rise over run
04:05:10 sqrt(n)+3
04:05:21 in fact
04:05:57 i'll buy that
04:06:45 wow
04:06:46 thanks
04:10:40 oerjan is right in that case since that one fails the naive linearity test: > let f x = x/20 + 7.95 in map f [81,121,361,25] ==> Lambda Bot: [12.0,14.0,> 26.0 < ,9.2]
04:11:08 and it looks ugly when you try it linearly ;)
04:13:05 so do you think people would like to play with a (sort-of) untyped lazy language with just constructors and easily extensible cases? it opens up a haskell-like language to the erlang-like use cases of long-uptime systems, etc.
04:13:22 since its easy to code-swap in that sort of system
04:24:37 -!- Sgeo has joined.
04:36:16 -!- edwardk has left (?).
04:52:05 -!- poiuy_qwert has quit.
05:20:22 -!- GreaseMonkey has joined.
06:18:38 -!- Sgeo has quit ("Ex-Chat").
06:20:30 -!- Guilt has joined.
06:20:41 Just a tiny game I've been designing in PEBBLE. . .
06:20:54 (PEBBLE being a macro language I devised which compiles to Brainfuck)
06:21:16 oh? :)
06:21:18 http://pikhq.nonlogic.org/pebble.php If you need to ask me questions, do so later; I'm going to bed, since it's late.
06:21:27 Sorry.
06:21:32 okay :) nighty night
06:21:35 getting up early--
06:21:36 Enjoy Brainfucking with everyone else. . .
06:21:40 If they're here.
06:21:54 i will! :)
06:50:56 -!- g4lt-sb100 has joined.
06:54:21 -!- sp3tt has quit (Read error: 104 (Connection reset by peer)).
07:00:08 -!- Guilt has quit ("CGI:IRC at http://freenode.bafsoft.ath.cx:14464/ (EOF)").
07:05:52 -!- g4lt-mordant has quit (Read error: 113 (No route to host)).
07:20:45 -!- tokigun_ has quit (Read error: 104 (Connection reset by peer)).
07:22:30 -!- tokigun has joined.
07:38:49 -!- GreaseMonkey has quit ("gtg bye").
07:46:00 -!- oerjan has quit ("leaving").
07:46:32 -!- puzzlet_ has joined.
07:46:42 -!- Guilt has joined.
07:50:52 -!- puzzlet_ has quit (Read error: 104 (Connection reset by peer)).
07:51:19 -!- puzzlet_ has joined.
07:51:32 -!- puzzlet__ has joined.
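
Tying off the two interpolation puzzles from earlier (03:51 to 04:11), a quick check of both answers; the function names f1 and f2 are mine.

    -- f1 is the linear answer lambdabot confirmed (slope (16-8)/(15-13) = 4,
    -- intercept fixed by f 15 = 16); f2 is oerjan's sqrt answer, using that
    -- 361, 121, 81, 25 are 19^2, 11^2, 9^2, 5^2.
    f1, f2 :: Double -> Double
    f1 x = 4 * x - 44
    f2 x = sqrt x + 3

    -- map f1 [15,34,13,20]   gives [16.0,92.0,8.0,36.0]  (so f 20 = 36)
    -- map f2 [361,121,81,25] gives [22.0,14.0,12.0,8.0]  (so f 25 = 8)
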
07:51:44 -!- puzzlet__ has quit (Client Quit).
07:59:59 -!- clog has quit (ended).
08:00:00 -!- clog has joined.
08:06:37 -!- puzzlet has quit (Read error: 110 (Connection timed out)).
08:12:55 i was wondering. would the equivalent of code-compression in brainfuck involve functions? :)
08:13:12 brainfuck code is kinda' huge
08:16:41 brainfuck code looks easy to compress, you know.
08:17:01 with lz, you could put those identified patterns together
08:24:07 yes, it's easy to compress.
08:24:23 try bzip2 :)
08:25:42 lament: no, at executable code level.
08:25:56 upxing gave me a 14% compression thing.
08:26:10 14% of uncompressed executable data
08:26:36 i was wondering if anybody tried doing code compression. :)
08:26:47 so you can also identify reusable brainfuck patterns
08:26:52 like a library of sorts.
08:27:15 hmm. the next best thing: a portable brainfuck library :D
08:27:17 hehe
08:27:18 i'm not sure whether "at executable code level" is meaningful.
08:27:38 there're several macro preprocessors for brainfuck, though.
08:27:42 lament: http://guilt.bafsoft.net/downloads/wip/Brainfuck.tar.bz2 check this out.
08:28:02 LostKng.b compiled to an executable of size 2 odd megs.
08:28:12 oh, i see
08:28:53 you know, which is why i wondered if there would be a way to reduce code.
08:29:05 like combining a compiler with a compressor. ;) lol
08:29:08 yes, of course.
08:30:15 hmm. one easy way to partition it is to take loops which lead to the same text pattern. make them functions with labels. call them instead of jumping to them, and return back.
08:30:37 that way you could save considerable space.
08:30:55 you're not passing any data through the stack, which makes it a little slower, but smaller
08:31:14 also, identify patterns of brainfuck code that can be compiled into single instructions
08:32:52 ie balanced loops
08:34:15 i'm already coercing shifts and additions
08:34:29 is that what you are talking about?
08:35:44 er. balanced loops
08:35:45 er. balanced loops?
08:35:53 didn't get that one?
08:36:16 loops which don't move the memory pointer
08:36:21 eg [->++<]
08:36:40 that don't move it?
08:36:55 the memory pointer is unchanged by the loop
08:37:13 the effect is m[mp+1] = m[mp] * 2; m[mp] = 0;
08:37:20 well. but it's used to increase the adjacent cell by twice the current cell value
08:37:27 yea
08:37:32 i'm not sure what the "but" in your sentence means.
08:38:06 for small shifts it's okay. but with a moving loop and shift it's impossible.
08:38:37 like, to set all values to zero (of lower and current cell): [[-]<]
08:39:23 that's not a balanced loop.
08:39:27 yea
08:39:36 but [-] is
08:39:38 oh okay :)
08:40:06 hm. i see. a balanced loop contains balanced loops and doesn't change the memory pointer position\
08:40:42 alright. will read a bit about this :)
08:40:43 brb
08:40:47 good night
08:55:03 -!- sebbu2 has changed nick to sebbu.
09:00:48 -!- RedDak has joined.
09:48:24 -!- SEO_DUDE55 has quit (Read error: 104 (Connection reset by peer)).
09:58:39 back
09:58:45 lament, you slept?
10:14:34 -!- SEO_DUDE55 has joined.
10:35:43 -!- Guilt has quit ("CGI:IRC at http://freenode.bafsoft.ath.cx:14464/ (Ping timeout)").
10:35:57 -!- Guilt has joined.
10:41:13 -!- SEO_DUDE55 has quit (Remote closed the connection).
11:26:35 -!- Guilt has quit ("CGI:IRC at http://freenode.bafsoft.ath.cx:14464/ (EOF)").
11:52:18 -!- ehird` has joined.
12:13:11 -!- helios24 has quit (Read error: 60 (Operation timed out)).
13:03:01 -!- rutlov has joined.
13:03:35 -!- helios24 has joined.
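
A sketch of the "balanced loop" criterion discussed above: a loop whose body has zero net pointer movement, counting nested loops only when they are themselves balanced, can be compiled to straight-line arithmetic instead of being interpreted. The Op type and the function names are assumptions for the illustration.

    data Op = Inc | Dec | MoveR | MoveL | In | Out | Loop [Op]

    -- Net pointer movement of a code fragment, or Nothing if some nested
    -- loop is unbalanced (its movement then depends on runtime data).
    netShift :: [Op] -> Maybe Int
    netShift = fmap sum . traverse shift
      where
        shift MoveR    = Just 1
        shift MoveL    = Just (-1)
        shift (Loop b) = case netShift b of
                           Just 0 -> Just 0    -- balanced nested loop: no net move
                           _      -> Nothing   -- unbalanced nested loop
        shift _        = Just 0                -- + - , . never move the pointer

    -- A loop body is balanced iff its net movement is exactly zero, e.g.
    -- balanced [Dec, MoveR, Inc, Inc, MoveL] is True  (that is "[->++<]"),
    -- while   balanced [Loop [Dec], MoveL]   is False (that is "[[-]<]").
    balanced :: [Op] -> Bool
    balanced body = netShift body == Just 0

A compiler that recognises balanced loops can then pattern-match bodies like [->++<] into multiply-accumulate style instructions, which generalises the "coercing shifts and additions" optimisation mentioned above.
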
13:09:01 -!- rutlov has left (?).
13:18:11 this time i actually understood (partly) what edwardk said!
13:18:19 oh the joy
14:26:01 dadadadom
14:26:10 what did he say
14:26:36 I was quoting Beethoven.
14:27:44 edwardk
14:27:46 i meant
14:29:57 ahell of a lot
14:42:14 even though i'm pretty sure i understood, i am certainly not confident enough to try to explain it to you, so check the logs :P
14:43:36 i didn't get all of the monad stuff, but i mean last time i didn't understand one single sentence completely, i'm pretty sure :P
14:44:23 i did understand the idea, that was pretty obvious, but he's got a lot of words
14:55:10 SimonRC: Leck mich im Arsch
14:56:08 (just quoting beethoven)
14:56:15 mozart
14:56:16 actually
14:56:48 damnit
14:57:15 stupid identical old guys
15:37:13 -!- RedDak has quit (Remote closed the connection).
15:58:43 -!- jix has joined.
17:07:49 -!- oerjan has joined.
17:21:05 bsmntbombdood: ?!
17:30:32 beethoven and mozart, presumably. famous decomposers.
17:35:09 -!- SEO_DUDE55 has joined.
17:44:51 * SimonRC goes
18:15:39 -!- _D6Gregor1RFeZi has changed nick to GregorR.
18:49:08 -!- sp3tt has joined.
19:06:35 -!- oerjan has quit ("leaving").
19:43:27 -!- sebbu2 has joined.
20:02:39 -!- sebbu has quit (Connection timed out).
20:13:29 * ehird` is a famous decomposer
20:17:39 like a maggot?
20:17:48 and beethoven and mozart
20:18:11 no, those two are decomposing composers
20:18:47 beethoven, for example, wasn't famous for his decomposing until after his death
20:21:14 oerjan
20:21:14 17:29:47
20:21:14 beethoven and mozart, presumably. famous decomposers.
20:21:19 so ha
20:53:29 -!- RedDak has joined.
21:06:24 -!- bsmnt_bot has quit (Read error: 145 (Connection timed out)).
21:14:37 -!- bsmntbombdood has quit (Read error: 110 (Connection timed out)).
21:39:06 -!- jix has quit ("CommandQ").
22:03:41 -!- oerjan has joined.
22:10:42 -!- bsmntbombdood has joined.
22:48:33 how much work is a compiler allowed to do and still be called a compiler?
22:52:07 just about anything?
22:52:39 remember that a compiler cannot access actual runtime input, which limits things
22:52:53 only a little
22:53:13 * ehird` thinks that in /me's perfect language, compiler and interpreter would be the same word!
22:59:15 -!- ehird` has quit.
23:01:32 -!- ehird` has joined.
23:43:14 bsmntbombdood: Compilers are just language translators that target languages not intended to be read by humans.
23:43:34 bsmntbombdood: If your program does something aside from language translation, it's not a compiler.
23:44:03 i was thinking about optimizations
23:44:11 I guess I can remove the clause "not intended to be read by humans"
23:44:46 bsmntbombdood: Optimization is a component of the translation. It's like translating a phrase into a coined expression instead of into a roundabout explanation.
23:45:05 (To compare it to real language translation)
23:47:29 -!- RedDak has quit (Remote closed the connection).