Compare commits


532 commits

Author SHA1 Message Date
Jakub Doka a3355a59c0
update on incomplete example 2024-11-24 18:55:30 +01:00
Jakub Doka b63b832383
wut 2024-11-24 18:51:20 +01:00
Jakub Doka 116f045a5f
adding defer 2024-11-24 18:50:55 +01:00
Jakub Doka 784d552c1d
fixing scoping bug 2024-11-24 16:43:45 +01:00
Jakub Doka 24a3aed360
implementing struct method for non generic contexts 2024-11-24 16:17:57 +01:00
Jakub Doka 58ee5c0a56
in progress of adding methods 2024-11-24 14:47:38 +01:00
Jakub Doka 9dfb2eb606
fg 2024-11-24 11:26:38 +01:00
Jakub Doka 5df4fb8882
changing the gcm to not mutate nodes in recursive functions 2024-11-23 19:47:17 +01:00
Jakub Doka 86ca959ea3
removing needless data dependencies 2024-11-23 17:23:33 +01:00
Jakub Doka f353bd5882
passing down the inference of 'void' to statements 2024-11-23 15:33:28 +01:00
Jakub Doka cad0a828d0
updating tests 2024-11-23 15:28:27 +01:00
Jakub Doka fb119bc6eb
allowing compatison of types 2024-11-23 15:28:02 +01:00
Jakub Doka aa83ed2ec9
fixing the annoyance 2024-11-23 14:19:47 +01:00
Jakub Doka fb11c94af4
fixing some bugs and extending runtime of programs 2024-11-23 12:35:16 +01:00
Jakub Doka b030b1eeb7
fixing user posts not being displayed 2024-11-23 11:14:03 +01:00
Jakub Doka 4856533b22
making sure dependencies are formatted 2024-11-23 10:31:38 +01:00
Jakub Doka d8d039b67a
adding capability to run posts and displaing cumulative runs 2024-11-23 10:18:05 +01:00
Jakub Doka b760d9ef75
improving theming 2024-11-23 08:38:41 +01:00
Jakub Doka e587de1778
fixing some lighthouse issues and bad coloring of icons 2024-11-23 00:59:58 +01:00
Jakub Doka b12579ff65
adding import count to posts 2024-11-23 00:06:25 +01:00
Jakub Doka 0aa355695a
fixing the buggg, finally 2024-11-22 21:01:45 +01:00
Jakub Doka 4a857d2317
garbage 2024-11-22 19:50:59 +01:00
Jakub Doka 2253ac6198
fixing some bugs 2024-11-22 19:50:36 +01:00
Jakub Doka 05a7bd0583
chaning css for no reason 2024-11-18 13:10:44 +01:00
able ab55ec0240 Merge pull request 'point the profile button at a users fully qualified profile page' (#24) from able-patch-1 into trunk
Reviewed-on: https://git.ablecorp.us/AbleOS/holey-bytes/pulls/24
2024-11-18 06:09:19 -06:00
able 8353ab58a5 point the profile button at a users fully qualified profile page
Signed-off-by: able <abl3theabove@gmail.com>
2024-11-18 06:08:46 -06:00
Jakub Doka f527d61c1e
limitting request body size 2024-11-18 11:28:47 +01:00
Jakub Doka f3879cb013
fixing js errors, adding password change form 2024-11-18 10:31:30 +01:00
Jakub Doka e89511b14c
fixing position reporting for optimized returns 2024-11-17 22:26:31 +01:00
Jakub Doka 1c135a3050
adding interesting asert 2024-11-17 21:43:02 +01:00
Jakub Doka f83194359c
fixed yet another off by 1 2024-11-17 21:35:41 +01:00
Jakub Doka becd5c4b7f
well? 2024-11-17 21:30:20 +01:00
Jakub Doka 37dd13cab2
brah 2024-11-17 21:27:30 +01:00
Jakub Doka bc2dd82eb7
welp 2024-11-17 21:25:22 +01:00
Jakub Doka aa2de502cc
saving 2024-11-17 21:09:36 +01:00
Jakub Doka 542c69fd60
changing case checking to a warning 2024-11-17 20:57:10 +01:00
Jakub Doka 95e9270fef
adding case checking 2024-11-17 20:04:53 +01:00
Jakub Doka fe5a8631f6
fixed a bug of not marking idents as used 2024-11-17 18:44:24 +01:00
Jakub Doka 8892dd729a
fixing the false return location 2024-11-17 18:15:58 +01:00
Jakub Doka a7718e1220
cleaning up some code 2024-11-17 17:14:44 +01:00
Jakub Doka e079bbd312
forgot to support explicit enum type 2024-11-17 16:30:59 +01:00
Jakub Doka 12b9d43754
adding minimal enums 2024-11-17 16:25:39 +01:00
Jakub Doka 397b2a4b1b
fixed a stack prelude postlude being needlesly generated + struct can now be compared 2024-11-17 10:06:10 +01:00
Jakub Doka 12bb7029b4
fixed float comparison in the vm 2024-11-16 21:38:10 +01:00
Jakub Doka a64383e72b
saving 2024-11-16 20:52:38 +01:00
Igor null 2034152c83 added syntax highlighting to depell 2024-11-16 13:48:31 -06:00
Jakub Doka 13714eb513
removing stable feature supression 2024-11-16 17:41:40 +01:00
Jakub Doka 4088bd18b1
fixing compilation error for ableos 2024-11-16 17:29:30 +01:00
Jakub Doka e94b812b3b
removing the regalloc dependency 2024-11-16 14:22:34 +01:00
Jakub Doka e5d6b35f66
removing needless copies to zero register for unused values 2024-11-16 13:42:17 +01:00
Jakub Doka e6df9b6b01
cleanup 2024-11-16 11:46:59 +01:00
Jakub Doka baa70d3f12
removing needless copy into ret register 2024-11-16 10:16:35 +01:00
Jakub Doka ec4499e519
removing duplicate un instruction 2024-11-16 10:05:56 +01:00
Jakub Doka 085c593add
removing needless truncation of zeroth register 2024-11-16 10:03:08 +01:00
Jakub Doka 867a750d8f
hopefully this is less of a mess now 2024-11-15 23:18:40 +01:00
Jakub Doka b1b6d9eba1
fix the inference on itf 2024-11-15 22:53:22 +01:00
Jakub Doka 12be64965f
maybe fixed mandelbrot 2024-11-15 22:35:03 +01:00
Jakub Doka 7058efe75c
reverting phi transformation 2024-11-15 19:32:04 +01:00
Jakub Doka afc1c5aac5
orginizing null checks better to get more peephole hits 2024-11-15 14:36:33 +01:00
Jakub Doka 83146cfd61
adding a simple peephole for phis 2024-11-15 13:06:03 +01:00
Jakub Doka bb625a9e19
some cleanump and ironing out bugs in new regalloc 2024-11-15 12:04:05 +01:00
Jakub Doka 81cf39b602
adding more info for assert 2024-11-14 21:57:51 +01:00
Jakub Doka e4da9cc927
adding regalloc, fixing needless semicolon insertion 2024-11-14 21:50:10 +01:00
Jakub Doka 454b0ffd1c
adding regalloc option 2024-11-14 21:34:31 +01:00
Jakub Doka 981c17ff19
fixing function destinations 2024-11-14 20:25:52 +01:00
Jakub Doka d01e31b203
fixing stack return values 2024-11-13 16:18:21 +01:00
Jakub Doka 9cb273a04b
edge case of returning stack from inlined function 2024-11-13 15:56:37 +01:00
Jakub Doka 2e2b7612d9
some cleanup 2024-11-13 15:45:45 +01:00
Jakub Doka f493c2776f
forgot to fix return 2024-11-13 15:27:35 +01:00
Jakub Doka f77bc52465
fixing unchanged parsed file 2024-11-13 15:25:27 +01:00
Jakub Doka f524013c34
making use of zero register 2024-11-13 10:28:16 +01:00
Jakub Doka 3c86eafe72
fixing another problem with rescheduling 2024-11-13 08:49:25 +01:00
Jakub Doka 0d87bf8f09
removing browser from nontest envs 2024-11-12 22:30:10 +01:00
Jakub Doka e5a4561f07
very funny fix 2024-11-12 21:54:23 +01:00
Jakub Doka b71031c146
prolly fix 2024-11-12 21:12:57 +01:00
Jakub Doka dd51961fbb
adding assert for better error 2024-11-12 21:10:42 +01:00
Jakub Doka 63f2a0dac0
well... 2024-11-12 20:59:12 +01:00
Jakub Doka 4ec88e3397
adding pointer edgecase 2024-11-12 20:42:04 +01:00
Jakub Doka f1e715e9bd
refactoring truncation 2024-11-12 19:02:29 +01:00
Jakub Doka 80fd0e89b4
fixing the inline flag delegation with generic functions 2024-11-12 17:32:20 +01:00
Jakub Doka 9949086011
allowing eca in inline functions 2024-11-12 17:11:39 +01:00
Jakub Doka c701eb7b6d
adding extra test 2024-11-12 12:54:36 +01:00
Jakub Doka f1deab11c9
making better peepholes and fixing overoptimization on memory swaps 2024-11-12 12:20:08 +01:00
Jakub Doka f079daa42d
removing redundant loop phys 2024-11-11 23:33:36 +01:00
Jakub Doka 7cac9382ad
we assumed unary operands are at leas 4bytes bit 2024-11-11 23:17:13 +01:00
Jakub Doka ce2f7d2059
fixing negation truncation 2024-11-11 23:02:02 +01:00
Jakub Doka f5f9060803
adding missing instruction selection 2024-11-11 22:36:20 +01:00
Jakub Doka ad7fb5d0fc
adding errors for useless type hints 2024-11-11 22:34:42 +01:00
Jakub Doka d99672b751
fixing too strict assert 2024-11-11 22:14:54 +01:00
Jakub Doka 7def052749
preventing dangling nodes due to cycles in loop phys 2024-11-11 21:55:18 +01:00
Jakub Doka b2eefa5b83
removing assert that can cause crashes 2024-11-11 09:07:36 +01:00
Jakub Doka 3c35557872
fixing type variables in loops 2024-11-11 09:06:34 +01:00
Jakub Doka b6274f3455
fixing yet another edge case 2024-11-10 20:30:35 +01:00
Jakub Doka c61efc3933
adding inline functions 2024-11-10 19:35:48 +01:00
Jakub Doka 654005eea2
updating tests 2024-11-10 18:59:29 +01:00
Jakub Doka 335e6ec20a
fixing nasty aclass clobber priority bug 2024-11-10 18:56:33 +01:00
Jakub Doka 1e02efc1eb
improving load analisys 2024-11-10 17:32:24 +01:00
Jakub Doka 8b98c2ed1b
fixing different file imports 2024-11-10 12:26:30 +01:00
Jakub Doka c353d28be0
fixing another problem with const 2024-11-10 12:03:15 +01:00
Jakub Doka 7865d692a1
fixing the rescheduling edgecase 2024-11-10 11:04:04 +01:00
Jakub Doka 29a23cec0c
removing dbg 2024-11-10 10:30:45 +01:00
Jakub Doka 5dce4df2a1
fixing more stuff 2024-11-10 10:28:02 +01:00
Jakub Doka 42a713aeae
fixing wrong instruction selection 2024-11-10 09:17:43 +01:00
Jakub Doka 823c78bf74
preventing deduplication to cause bugs 2024-11-09 15:14:03 +01:00
Jakub Doka c657084451
integer constants can be casted to floats if type is known to be a float 2024-11-09 14:02:13 +01:00
Jakub Doka 63a1c7feb4
fixing float conversion constant folding 2024-11-09 13:54:08 +01:00
Jakub Doka bedffa9b32
fixing constant fmt newline preservation 2024-11-09 10:58:57 +01:00
Jakub Doka b8032aa840
wrong index for extend 2024-11-09 10:28:53 +01:00
Jakub Doka 65e9f272a8
forgotten dbgs 2024-11-08 23:03:16 +01:00
Jakub Doka d2052cd2a3
adding back the exit code 2024-11-08 21:53:24 +01:00
Jakub Doka 29367d8f8b
fixing compiler pulling function destinations out of the loops 2024-11-08 20:40:18 +01:00
Jakub Doka a299bad75b
adding some simple provenance checks on return values 2024-11-08 11:51:10 +01:00
Jakub Doka 7d48d3beb1
adding constants 2024-11-08 10:57:58 +01:00
Jakub Doka 68c0248189
making type manipulation nicer 2024-11-08 10:25:34 +01:00
Jakub Doka 0ef74d89cb
putting logger back 2024-11-08 08:40:14 +01:00
Jakub Doka 1b2b9f899d
ups 2024-11-08 08:36:13 +01:00
Jakub Doka 455f70db6e
adding better error reporting when compiler crashes errors are now sent trough out buffer 2024-11-08 08:36:00 +01:00
Jakub Doka 0374848b28
fixing formatter not reporting errors 2024-11-07 17:02:22 +01:00
Jakub Doka 513d2c7127
removing log line 2024-11-07 16:42:57 +01:00
Jakub Doka 9d2f419140
fixing messed up flag calculation 2024-11-07 16:39:15 +01:00
Jakub Doka f535ea7b0a
ups, left log lines 2024-11-07 16:05:16 +01:00
Jakub Doka be6d0d3f18
removing wrong graph query in a peephole 2024-11-07 10:47:31 +01:00
Jakub Doka 2718ef8523
ups 2024-11-07 10:43:45 +01:00
Jakub Doka 3ee78f3a31
fixing bugs from the new tests 2024-11-07 10:43:29 +01:00
Jakub Doka 2bac7c1fb3
saving 2024-11-07 08:53:11 +01:00
mlokis 79a3f1ab2b Merge pull request 'various tests' (#23) from koniifer/holey-bytes:trunk into trunk
Reviewed-on: https://git.ablecorp.us/AbleOS/holey-bytes/pulls/23
Reviewed-by: mlokis <mlokis@email.com>
2024-11-07 01:49:56 -06:00
koniifer b15e66b2af test broken sin function 2024-11-06 17:35:01 +00:00
koniifer d2ba7cc101 returning optional issues test 2024-11-06 15:17:03 +00:00
koniifer d3ee72306e optional from eca test 2024-11-05 18:07:04 +00:00
Jakub Doka 87cb77a553
making a Backend trait to separate the different backends we will have in the fucture 2024-11-05 14:52:30 +01:00
Jakub Doka 276d1bb0cf
fixing tab indentation in error messages and depell not displaying errors 2024-11-05 09:41:57 +01:00
Jakub Doka 5cce904135
fixing struct null check on function arguments 2024-11-04 19:57:15 +01:00
Jakub Doka 3338d50672
nasty bug with rescheduled load 2024-11-04 19:18:37 +01:00
Jakub Doka 2e36f32ae0
fixing very sneaky bug 2024-11-04 12:38:47 +01:00
Jakub Doka e8f1d2af8c
allowing 0 idk 2024-11-03 22:54:05 +01:00
Jakub Doka 999b25df8b
adding '_ = <expr>' syntax 2024-11-03 22:27:37 +01:00
Jakub Doka 61250c906a
comparison of non null types to null are now errors 2024-11-03 21:31:46 +01:00
Jakub Doka 44fc9c3e2e
deferring all null checks after the peepholes 2024-11-03 21:13:24 +01:00
Jakub Doka 798000c756
little correction 2024-11-03 10:23:17 +01:00
Jakub Doka 9de631234d
adding unreachable 2024-11-03 10:15:03 +01:00
Jakub Doka 843fbddf3b
loops in inlined functions now work better 2024-11-03 08:59:42 +01:00
Jakub Doka 38a00cbaa0
some start for homemade regalloc 2024-10-31 14:56:55 +01:00
Jakub Doka 4664240e08
eliminating even more todos 2024-10-31 11:10:05 +01:00
Jakub Doka 728d563cea
eliminating more todos 2024-10-31 11:03:58 +01:00
Jakub Doka 56984f08ff
eliminating more todos 2024-10-31 10:56:59 +01:00
Jakub Doka 3f9f99ff65
adding optional values 2024-10-31 10:36:18 +01:00
Jakub Doka 9ed3c7ab9e
saving 2024-10-30 20:20:03 +01:00
Jakub Doka acacd10ee9
microoptimizing bitset 2024-10-30 18:42:25 +01:00
Jakub Doka f6f661cee3
finally struct operators fold into constant 2024-10-30 14:10:46 +01:00
Jakub Doka 4bfb5f192e
fixing the matrix 2024-10-30 13:45:19 +01:00
Jakub Doka ea628c1278
saving 2024-10-29 20:38:33 +01:00
Jakub Doka 7448339605
removing return value temporary optimization sadly 2024-10-29 17:03:00 +01:00
Jakub Doka da7cd5926c
unifiing annoying clobber logic 2024-10-29 15:15:30 +01:00
Jakub Doka 9cf7933251
clobber global loads across functions 2024-10-29 15:04:07 +01:00
Jakub Doka 24b9f9e78b
adding floating point conversions 2024-10-29 14:24:31 +01:00
Jakub Doka 80558ea7e6
adding floating point arithmetic 2024-10-29 13:36:12 +01:00
Jakub Doka 348d9014e3
adding a lot better load elimination 2024-10-29 10:31:52 +01:00
Jakub Doka 30bd6103a6
cleaning up some code 2024-10-29 10:01:37 +01:00
Jakub Doka 97eb985a02
removing specific opts from a fucntion and adding them to the general peepholes 2024-10-29 09:04:49 +01:00
Jakub Doka 7ef1adf7e2
saving 2024-10-28 23:38:57 +01:00
Jakub Doka be828b8c54
properly handling cases when stack is referenced by dofferent part of the memory 2024-10-28 18:53:36 +01:00
Jakub Doka b4b3bae104
fixing storing struct pointer bug 2024-10-28 18:39:42 +01:00
Jakub Doka 33d78fbc52
missing scoping 2024-10-28 17:22:18 +01:00
Jakub Doka be2d38a6d2
making the aliasing analisys bit smarter 2024-10-28 17:19:41 +01:00
Jakub Doka bbd7e12af4
saving this to make sure 2024-10-28 16:18:53 +01:00
Jakub Doka 37db783699
making stack peeps compatible with parallel alias classes 2024-10-28 12:39:26 +01:00
Jakub Doka 948710dc27
fixing an infeence bug 2024-10-28 12:36:46 +01:00
Jakub Doka f0a588fcff
updating test 2024-10-28 11:04:47 +01:00
Jakub Doka 9c32f260a1
cleanup 2024-10-27 21:34:03 +01:00
Jakub Doka 047e1ed15c
adding null 2024-10-27 19:55:11 +01:00
Jakub Doka 2c2f0c048b
cleaning up tests 2024-10-27 19:13:25 +01:00
Jakub Doka 3c12c0e288
removing codegen test outputs 2024-10-27 19:08:22 +01:00
Jakub Doka ca8497550a
updating commandline help 2024-10-27 19:02:13 +01:00
Jakub Doka 849e842336
dont write to the file if the contents did not differ from formatted 2024-10-27 18:56:29 +01:00
Jakub Doka 5c82623db9
removing stuff 2024-10-27 18:37:18 +01:00
Jakub Doka e8a8fa3eb1
simplifiing upcasts, hwich conincidentally allowed more optimizations 2024-10-27 18:21:33 +01:00
Jakub Doka 5926f69e6c
fixing missing upcast 2024-10-27 18:04:50 +01:00
Jakub Doka 83d3fb4919
adding array reformatting rule 2024-10-27 16:07:46 +01:00
Jakub Doka b429534d23
moving hbvm related code into one file 2024-10-27 14:29:14 +01:00
Jakub Doka b187af64a8
removing old compiler 2024-10-27 13:57:00 +01:00
Jakub Doka ce7bb001da
handling infinite loops properly 2024-10-27 11:32:34 +01:00
Jakub Doka 9c90adbfe8
removing idk for scalar values (antipattern) 2024-10-27 00:02:59 +02:00
Jakub Doka db62434736
fixing infinite loop when fetching cycles 2024-10-26 23:24:45 +02:00
Jakub Doka 3d721812f0
adding better dead code elimination 2024-10-26 20:29:31 +02:00
Jakub Doka 5b23a0661b
fixing peep_iter related bugs 2024-10-26 15:18:00 +02:00
Jakub Doka 7c919cd453
fixing nonexistent identifier file mismatch 2024-10-26 14:06:08 +02:00
Jakub Doka bb61526d3e
eliminating more useless stack moves related to return values 2024-10-26 13:43:36 +02:00
Jakub Doka 45e1c6743a
eliminating more useless operations 2024-10-26 12:48:57 +02:00
Jakub Doka 39588579a8
using more typesafe locking 2024-10-26 12:09:53 +02:00
Jakub Doka 9095af6d84
appliing late peepholes 2024-10-26 10:45:50 +02:00
Jakub Doka b62413046d
cleanup 2024-10-26 10:25:42 +02:00
Jakub Doka af4d965b8c
fixed fmt error reporting 2024-10-26 09:53:14 +02:00
Jakub Doka 855da58e06
fixed a binor uga buga 2024-10-26 01:07:35 +02:00
Jakub Doka 2fc24f0f58
ups 2024-10-26 00:37:39 +02:00
Jakub Doka 8016b1fad5
adding rescheduling 2024-10-26 00:34:22 +02:00
Jakub Doka 46f9903562
adding error log when compiler crashes 2024-10-25 23:05:43 +02:00
Jakub Doka 517850f283
fixing undescriptive error or not enough arguments 2024-10-25 22:59:01 +02:00
Jakub Doka faa8dd2e6f
adding pointer checks on ecas 2024-10-25 16:33:56 +02:00
Jakub Doka d23d010917
fixing eror message 2024-10-25 16:08:20 +02:00
Jakub Doka b1da36ecde
fixing upcasting signed to unsigned 2024-10-25 15:45:00 +02:00
Jakub Doka e62aab9b4b
adding better binaro operator errors positions 2024-10-25 15:40:23 +02:00
Jakub Doka 423361a80e
forgottend typecheck on a struct 2024-10-25 15:31:49 +02:00
Jakub Doka 62a7c61cdc
properly selecting li instructions for integer sizes 2024-10-25 15:29:17 +02:00
Jakub Doka 2bab16d3ce
making never type cause less errors 2024-10-25 15:14:32 +02:00
Jakub Doka c88daa4800
adding better negative number inference 2024-10-25 15:07:39 +02:00
Jakub Doka 6988d8893f
changing uint to be the default 2024-10-25 14:51:33 +02:00
Jakub Doka 64e228450f
little cleanup and fixing error recovery 2024-10-25 11:29:54 +02:00
Jakub Doka 897e121eeb
fixing stack alloc overoptimization 2024-10-24 19:57:36 +02:00
Jakub Doka 648bd24d0d
forgot to mul by 8 2024-10-24 16:26:28 +02:00
Jakub Doka aefa7e6405
forgot 2024-10-24 15:49:41 +02:00
Jakub Doka 026f6141e6
forgot 2024-10-24 15:45:16 +02:00
Jakub Doka cb88edea1f
fixing overoptimization of load -> store 2024-10-24 15:39:38 +02:00
Jakub Doka 127e8dcb38
fixing @as misbehaving 2024-10-24 14:10:07 +02:00
Jakub Doka 9c43dafcf5
fixing @as misbehaving 2024-10-24 14:08:17 +02:00
Jakub Doka e65dbcfcbe
fixing bitcasts 2024-10-24 13:58:58 +02:00
Jakub Doka e0d4955bd5
fixing small struct return 2024-10-24 13:25:30 +02:00
Jakub Doka 78ebc3292c
removing useless clobbers 2024-10-24 12:28:18 +02:00
Jakub Doka 0c2db878f0
adding the stack optimizations 2024-10-24 10:21:10 +02:00
Jakub Doka cb9d7f7d1e
okay now it works 2024-10-24 09:43:07 +02:00
Jakub Doka 41b70bec43
should work better 2024-10-23 12:26:07 +02:00
Jakub Doka f013e90936
better somehow 2024-10-22 22:57:40 +02:00
Jakub Doka 6977cb218c
seems to be compiling 2024-10-22 16:54:32 +02:00
Jakub Doka 3f30735eaa
seems to be compiling 2024-10-22 16:53:48 +02:00
Jakub Doka 58f4837ae0
eliminating important todo 2024-10-22 16:03:23 +02:00
Jakub Doka b95bddac7b
ups 2024-10-22 12:57:49 +02:00
Jakub Doka 7d53706e71
adding --optimize flag to the compiler 2024-10-22 12:50:54 +02:00
Jakub Doka 4d699fcbf1
strinc operatos seem to work now 2024-10-22 12:40:41 +02:00
Jakub Doka 5aa6150c70
now the generic types work too 2024-10-22 10:17:16 +02:00
Jakub Doka b0a85f44c9
fixing some bugs and making the generic types work, well not quite 2024-10-22 10:08:50 +02:00
Jakub Doka 2aa5ba9abc
generic functions work now 2024-10-22 07:20:08 +02:00
Jakub Doka 35d34dca54
sweeping trought more tests 2024-10-21 19:57:55 +02:00
Jakub Doka bc817c4ea2
implementing directives 2024-10-21 18:57:23 +02:00
Jakub Doka 0298b32e38
sniping a peephole 2024-10-21 17:29:11 +02:00
Jakub Doka 73c9ccef6a
simplifing code patterns and sixing argument passing 2024-10-21 17:04:29 +02:00
Jakub Doka ad4aed9c98
fixing loop bugs and some optimization edgecases 2024-10-21 15:12:37 +02:00
Jakub Doka 8528bef8cf
adding more tests, fixing pointer math, and integer upcasting 2024-10-20 21:50:08 +02:00
Jakub Doka 11c8755b18
implementing wide returns and adding integer upcast ops 2024-10-20 21:00:56 +02:00
Jakub Doka d5c90b95a7
committy committy 2024-10-20 18:50:10 +02:00
Jakub Doka 1da900461c
fixing struct return and copy miscompilation 2024-10-20 18:49:41 +02:00
Jakub Doka 3aff6fc006
reorganizing the type parser trait 2024-10-20 16:43:25 +02:00
Jakub Doka ccfde6c237
adding more tests and organizing things 2024-10-20 15:33:32 +02:00
Jakub Doka 44c4b71bb3
unifiing the type resolution into a trait 2024-10-20 15:16:55 +02:00
Jakub Doka c3a6e62bf2
implementing strings 2024-10-20 12:22:28 +02:00
Jakub Doka 00949c4ea8
implementing global variables 2024-10-20 10:37:48 +02:00
Jakub Doka 15e4762d4a
cleanup: 2 2024-10-19 19:53:43 +02:00
Jakub Doka 959bfd7f76
cleanup: 1 2024-10-19 19:37:02 +02:00
Jakub Doka 6ad0b41759
fixing code scheduling bugs 2024-10-19 10:17:36 +02:00
Jakub Doka 89cc611f7a
good direction 2024-10-18 16:57:00 +02:00
Jakub Doka cf74fdd99c
adding loops and seeing they totally not work 2024-10-18 16:52:54 +02:00
Jakub Doka 58578dd4b2
adding loops and seeing they totally not work 2024-10-18 16:51:54 +02:00
Jakub Doka 4a7b4e4ead
handling conditional stores 2024-10-18 13:11:11 +02:00
Jakub Doka c900f4ef5c
removing offset from stores and loads 2024-10-18 09:52:50 +02:00
Jakub Doka 3a494147ec
reorganizing tests 2024-10-18 08:43:50 +02:00
Jakub Doka 4336fec653
structs work with optimizations 2024-10-17 22:29:09 +02:00
Jakub Doka 11f6537a09
foo 2024-10-17 19:32:10 +02:00
Jakub Doka da58a5926d
removing some old garbage 2024-10-17 16:08:29 +02:00
Jakub Doka f5ef62c6bb
fixing nasty bug 2024-10-17 15:48:22 +02:00
Jakub Doka f386c332e5
adding link to the hosted page 2024-10-15 12:48:35 +02:00
Jakub Doka 23b90b3dd7
adding tab highlight and some more details in readme 2024-10-15 12:46:36 +02:00
Jakub Doka ea736d8824
lota of progress 2024-10-14 22:04:18 +02:00
Jakub Doka dc2e0cc5b3
implementing dependency fetching 2024-10-14 13:25:38 +02:00
Jakub Doka c9b85f9004
fixing sizeof not storing values 2024-10-13 22:24:57 +02:00
Jakub Doka af147b3cb6
adding test for embed 2024-10-13 20:01:18 +02:00
Jakub Doka 0f8a720fe8
wrong handling of embeds 2024-10-13 16:38:51 +02:00
Jakub Doka 2ab6f6c914
resolving shadoving in inlined functions correctly 2024-10-13 16:16:27 +02:00
Jakub Doka 54d93608aa
correctly implementing big block copies 2024-10-13 15:49:14 +02:00
Jakub Doka 19a6cdd764
implementing embeds 2024-10-13 15:33:57 +02:00
Jakub Doka 2660d976fe
fixing arithmetic bug 2024-10-13 15:22:16 +02:00
Jakub Doka 659ccbd637
adding stricter typechecking 2024-10-13 14:11:17 +02:00
Jakub Doka 3a2367f24f
fixing some bugs 2024-10-13 13:30:00 +02:00
Jakub Doka 0f4ff918d2
fixing bugs and improving memory consumption 2024-10-13 12:25:12 +02:00
Jakub Doka 6d7e726066
fixing generic function inlining 2024-10-12 22:29:52 +02:00
Jakub Doka 9e65f3949d
fixing arg datarace 2024-10-12 22:21:20 +02:00
Jakub Doka bf00dc85b2
progres on inline generic funcs and bettere error message 2024-10-12 22:03:53 +02:00
Jakub Doka 69b58c2b36
progress 2024-10-12 21:25:37 +02:00
Jakub Doka 5364b66629
adding gzip to static files 2024-10-12 15:04:58 +02:00
Jakub Doka c4826d3bfd
fixing feature incompatibility 2024-10-12 13:07:49 +02:00
Jakub Doka 07638caff0
starting the compiler wasm 2024-10-10 19:01:12 +02:00
Jakub Doka 5ef1ec4811
removing obvious temporary allocation 2024-10-10 16:08:03 +02:00
Jakub Doka f0ae65606d
renaming directories, reducing temporary allocations during parsing 2024-10-10 15:48:08 +02:00
Jakub Doka a538c0ddb0
optimizing ast allocation 2024-10-10 13:04:17 +02:00
Jakub Doka c31d1dcb9c
more size opts 2024-10-10 09:51:03 +02:00
Jakub Doka 54a7f85978
progress 2024-10-10 08:35:17 +02:00
Jakub Doka e200c2fc98
lock 2024-10-09 00:17:48 +02:00
Jakub Doka 1626734c1a
some progress 2024-10-09 00:17:13 +02:00
Jakub Doka 13f63c7700
cleanup 2024-10-06 10:51:33 +02:00
Jakub Doka c7dbe1c43d
adding some starter code to the depell 2024-10-05 23:07:33 +02:00
Jakub Doka 4c15f61cb7
something idk 2024-10-04 21:44:29 +02:00
Jakub Doka f1ea01ef0c
unifiing context key maps 2024-10-01 22:53:03 +02:00
Jakub Doka 2361e166cd
save 2024-10-01 21:39:23 +02:00
Jakub Doka 4d913462cb
save 2024-10-01 21:36:23 +02:00
Jakub Doka bdc2c43773
save 2024-10-01 21:33:30 +02:00
Jakub Doka b2254e9820
making compiler a bit smarter about evaluating types (more shortcuts) 2024-10-01 17:43:15 +02:00
Jakub Doka d293e02f62
relaxing by optimizing the compiler 2024-10-01 15:28:18 +02:00
Jakub Doka 1ee8d464c6
fixing infinite recusrion bug 2024-10-01 14:00:41 +02:00
Jakub Doka 2a4d27d8e6
tweak 2024-09-30 22:47:00 +02:00
Jakub Doka 1f5846afaa
fixing struct type display 2024-09-30 22:17:54 +02:00
Jakub Doka 006bc80f12
fixing duplicate structs 2024-09-30 22:15:40 +02:00
Jakub Doka 802e8b5d55
fixing a leaked reg bug when constant happens 2024-09-30 21:55:34 +02:00
Jakub Doka 6b7572f089
logging bugfix 2024-09-30 19:41:52 +02:00
Jakub Doka 1d04287532
separating the cli parsing 2024-09-30 19:35:25 +02:00
Jakub Doka 8b6d9b5de3
transitioning to log crate 2024-09-30 19:27:00 +02:00
Jakub Doka 136bba1631
some cleanup 2024-09-30 19:14:00 +02:00
Jakub Doka c1b00b6d6b
making nostd compat work 2024-09-30 19:09:17 +02:00
Jakub Doka a51b23187d
making a little utility for computing struct layouts 2024-09-28 21:56:39 +02:00
Jakub Doka c3f9e535d3
cleaning up tests 2024-09-28 16:34:08 +02:00
Jakub Doka 6d805dc2ec
fixing the cli 2024-09-28 16:28:05 +02:00
Jakub Doka 4291ebc25e
forgot 2024-09-28 15:14:17 +02:00
Jakub Doka 02c74a181d
making some pointer peepholes 2024-09-28 15:13:32 +02:00
Jakub Doka c0d4464097
implementing pointers example 2024-09-27 16:53:28 +02:00
mlokr 602249a48a
adding packed structs 2024-09-22 18:17:30 +02:00
mlokr 338e3f1519
fixing compiler errors 2024-09-21 14:46:12 +02:00
mlokr 0e9f4402cb
fixing hbbytecode having the std mentioned 2024-09-21 08:19:27 +02:00
mlokr 6057e88034
fixing a bug and preparing form memory manipulation with optimizations 2024-09-20 19:01:44 +02:00
mlokr 2a3d077476
fixing all supidus bugs 2024-09-20 16:37:51 +02:00
mlokr 8e62bd747b
fixing some other stuff that nerfs the code a bit (a lot) 2024-09-20 12:03:24 +02:00
mlokr b8ff503c14
fixing report bug 2024-09-20 11:01:10 +02:00
mlokr 9e69e53e24
merge 2024-09-20 08:20:48 +02:00
mlokr 4d163a2313
bruhma 2024-09-20 08:09:29 +02:00
mlokr e4e7f8d5b5
implementing ableos executable format 2024-09-19 13:40:03 +02:00
mlokr 4849807353
removing git support and relative path prefix which did nothing anyway 2024-09-18 10:34:07 +02:00
mlokr 6e30968c54
improving one particular error message 2024-09-18 10:14:17 +02:00
mlokr 6fc0eb3498
triing to turn absolute to relative paths in error messages 2024-09-18 10:07:40 +02:00
mlokr 98dfd6b09c
improving parser error messages 2024-09-18 09:47:52 +02:00
mlokr ece9bb8bf2
eca now infers the return type 2024-09-17 18:11:07 +02:00
mlokr 09fcbbc03b
forcing structs to always be on stack 2024-09-17 18:07:15 +02:00
mlokr a7fda408ef
forgot update 2024-09-17 17:59:32 +02:00
mlokr 5d77ae93b4
forcing structs to always be on stack 2024-09-17 17:59:03 +02:00
mlokr 4a9b9de87f
nah, lets use dummer code 2024-09-17 15:50:45 +02:00
mlokr bba3570788
adding wide return move for wider range of cases 2024-09-17 15:47:23 +02:00
mlokr 6852452f1a
fixing wide return value 2024-09-17 15:25:31 +02:00
mlokr 254d5ed962
fixing array bug, well actually more serious bug that somehow did not happen until now 2024-09-16 21:46:02 +02:00
mlokr faf068885a
replacing magic zeroes with proper constants 2024-09-16 15:49:27 +02:00
mlokr a2e864360e
removing comand line parsing library that is used for tool that anybody can read to see how to use it 2024-09-16 15:27:38 +02:00
mlokr 79e4cead2d
making many tests work 2024-09-15 20:14:56 +02:00
mlokr 6968e7d769
adding framerk to add comments to different places 2024-09-14 12:27:53 +02:00
mlokr c133c2dbe7
adding negation 2024-09-14 11:26:54 +02:00
mlokr 2bc7a5c13f
bratenburg 2024-09-13 20:31:05 +02:00
mlokr 16e2c32521
brah 2024-09-13 19:30:47 +02:00
mlokr da85d91a09
moving op instruction selection to token methods 2024-09-13 18:41:01 +02:00
mlokr e2a8373c42
updating tests 2024-09-13 18:23:00 +02:00
mlokr fbdabd8314
binary no longer contains comptime code and inoptimized impl is grately simplified 2024-09-13 18:22:27 +02:00
mlokr 39c4526797
saving 2024-09-13 15:12:20 +02:00
mlokr 2e3fbfa966
extracting testing logic 2024-09-13 14:30:23 +02:00
mlokr eebabc5070
accidente 2024-09-13 14:16:34 +02:00
mlokr b177cbe7c7
instruction scheduling somewhat works now 2024-09-13 14:15:45 +02:00
mlokr 641d344d2d
removing false positives 2024-09-12 18:42:21 +02:00
mlokr dc418bd5e0
blah buh leee 2024-09-10 20:54:11 +02:00
mlokr 8bbc40b9b1
fixing the compilation eror and maybe breaking the code 2024-09-10 20:50:36 +02:00
mlokr 8083bcb0e8
fixing the overoptimization 2024-09-10 16:32:15 +02:00
mlokr 8928888481
fixing the useles register alloc when loading 2024-09-10 12:16:42 +02:00
mlokr d64fa7e1f9
more oportunities to reduce register copies 2024-09-10 12:13:01 +02:00
mlokr b51f964cae
optimizing accumulation 2024-09-10 01:15:18 +02:00
mlokr 67b8ffe2f2
prolly fixed 2024-09-09 23:31:22 +02:00
mlokr 32bed04914
fixing vm bug 2024-09-09 22:52:34 +02:00
mlokr 6cb9489e9a
saving 2024-09-09 22:17:54 +02:00
mlokr 73727c2383
fixing more bugs and also adding uninig memory and also optimizing cong jumps 2024-09-09 19:36:53 +02:00
mlokr e8a5027cab
fixing the obscure string allocation bug 2024-09-08 17:25:33 +02:00
mlokr 50f3350418
saving 2024-09-08 17:11:33 +02:00
mlokr bb41da484f
adding global mutatuon to the test 2024-09-08 12:00:07 +02:00
mlokr ee30069195
switching to more optimal lookup and adding dynamic input array 2024-09-08 04:20:10 +02:00
mlokr 58c1c29293
not traversing controlfow can save us time 2024-09-08 03:15:05 +02:00
mlokr 49387dbe16
if a phy does not depend of different phy in the same loop we can modify it in place saving a register copy 2024-09-08 03:12:57 +02:00
mlokr 803095c0c5
implementing multiple breaks 2024-09-06 22:00:23 +02:00
mlokr 514c2fe630
ups 2024-09-06 18:50:53 +02:00
mlokr b4f64656fe
better error recovery 2024-09-06 18:50:28 +02:00
mlokr 73e13bd93c
more tests work now 2024-09-06 16:16:42 +02:00
mlokr b404e5b86d
more tests work now 2024-09-06 16:11:57 +02:00
mlokr 4bcab25231
upgraded error messages and inference 2024-09-06 02:42:07 +02:00
mlokr 414a07b99a
great 2024-09-06 02:04:19 +02:00
mlokr fdf4cccde0
making a mess 2024-09-06 01:17:54 +02:00
mlokr 1a3b0c2eec
implementing optimizations up to is statements 2024-09-05 11:16:11 +02:00
mlokr 955e4a5c7a
making functions example pass 2024-09-04 23:46:32 +02:00
mlokr d9aab2191b
maybe fixing a bug 2 2024-09-04 18:53:07 +02:00
mlokr 9dd09b2122
maybe fixing a bug 2024-09-04 18:51:56 +02:00
mlokr 937c107dec
updating tests and fixing bug 2024-09-04 18:48:25 +02:00
mlokr ed1b9459fc
some more 2024-09-04 18:43:08 +02:00
mlokr f063d0a4fd
disasm now displays literal string value 2024-09-04 18:38:32 +02:00
mlokr a21dee61e7
adding disasm option 2024-09-04 17:56:59 +02:00
mlokr 3807276a55
fixing integer parsing bug 2024-09-04 17:13:43 +02:00
mlokr 894f73ca35
adding more type checking 2024-09-04 16:54:34 +02:00
mlokr 00ad474881
making tests more robust for no reason 2024-09-04 02:35:09 +02:00
mlokr 9e0e0242aa
preparing for dead code elemination 2024-09-03 22:41:44 +02:00
mlokr a31e02449c
adding disasembler 2024-09-03 22:34:17 +02:00
mlokr b956cc78bb
adding bfn to ast nodes 2024-09-03 17:51:28 +02:00
mlokr 7279ed88e9
adding sime ignores 2024-09-03 00:27:50 +02:00
mlokr 9500db8764
some organization 2024-09-03 00:07:20 +02:00
mlokr 9404eb32a2
what 2024-09-02 13:28:07 +02:00
mlokr f172c33247
fixing nasety bug 2024-09-02 04:45:42 +02:00
mlokr 75dca64648
popping the inlined function arguments 2024-09-02 03:56:22 +02:00
mlokr 97c62e424a
fixing inconststent context for function 2024-09-02 03:37:49 +02:00
mlokr a2c08b6ef6
fixing inlining related bugs 2024-09-02 03:21:39 +02:00
mlokr a78d2bc3e9
small fix 2024-09-02 02:45:03 +02:00
mlokr ad3fc1190c
uptimizing a bit more, inline calls 2024-09-02 02:38:11 +02:00
mlokr 641be15703
most likely fixed 2024-09-01 23:51:59 +02:00
mlokr cbe6f98dff
maybe fixed 2024-09-01 23:14:48 +02:00
mlokr 9bdacfffb2
ups 2024-09-01 22:08:17 +02:00
mlokr f13f500d6e
fixing minmaxing bug 2024-09-01 22:07:45 +02:00
mlokis 4e9d6094bd Merge pull request 'Support Hex, Binary, and Octal number literals' (#20) from koniifer/holey-bytes:trunk into trunk
Reviewed-on: https://git.ablecorp.us/AbleOS/holey-bytes/pulls/20
2024-09-01 19:19:10 +00:00
mlokr 28e33d11c9
removing garbage 2024-09-01 21:15:29 +02:00
koniifer 581c4d531c fix obvious bugs, add unit test, hopefully preserve radix when formatting 2024-09-01 20:07:19 +01:00
mlokr 9012f976c5
bruh 2024-09-01 21:05:17 +02:00
koniifer 27462d9a33 forgot octals exist 2024-09-01 18:45:38 +01:00
koniifer 781c40ede0 support hex, octal, binary literals 2024-09-01 18:42:04 +01:00
mlokr 9af7bf559f
fixed confused shift tokens (I still dont know which side is left) 2024-07-21 10:29:58 +02:00
mlokr 5a6474f066
fixed nasty wrong scope bug 2024-07-20 18:52:24 +02:00
mlokr 33a4bf7d01
barbar 2024-07-19 21:19:03 +02:00
mlokr cac99cd34d
foobar 2024-07-19 21:04:22 +02:00
mlokr 5555b9865a
v 2024-07-19 15:51:02 +02:00
mlokr f964520641
u 2024-07-19 14:39:30 +02:00
mlokr a88d3a5c9d
t 2024-07-19 14:24:58 +02:00
mlokr 416f646957
s 2024-07-19 14:17:45 +02:00
mlokr 12b39c5b3f
q 2024-07-19 14:05:13 +02:00
mlokr 4dcaae8362
p 2024-07-19 13:51:38 +02:00
mlokr ab903fa4ea
o 2024-07-19 13:44:35 +02:00
mlokr c48a2d2799
l 2024-07-19 13:39:42 +02:00
mlokr fb01407465
k 2024-07-19 13:38:30 +02:00
mlokr 71359d82aa
j 2024-07-19 13:21:14 +02:00
mlokr 29d5774c47
i 2024-07-19 13:09:45 +02:00
mlokr 434acfbc7b
i 2024-07-19 13:09:07 +02:00
mlokr 6a03f125a5
h 2024-07-19 13:03:11 +02:00
mlokr 03aedb5d3f
g 2024-07-19 13:02:00 +02:00
mlokr a1179f3320
f 2024-07-19 12:56:40 +02:00
mlokr ba73a89171
e 2024-07-19 12:52:11 +02:00
mlokr 523ca6d103
d 2024-07-19 12:51:26 +02:00
mlokr 654b7eb7af
d 2024-07-19 12:42:06 +02:00
mlokr 4c3b63df25
c 2024-07-19 12:39:19 +02:00
mlokr 9a8a56fe97
b 2024-07-19 12:34:22 +02:00
mlokr aeb3a37f7d
fixing typechecking issues, pointers now properly typecheck and generic types are properly cached 2024-07-19 12:00:55 +02:00
mlokr 3c01a40ef2
fixing wide register returns 2024-07-18 17:55:55 +02:00
mlokr 4f9d4f2e71
arrays work i guess 2024-07-08 18:08:58 +02:00
mlokr 25bbe247e9
added example for struct patters 2024-07-08 11:00:35 +02:00
mlokr ab41d49a3d
some stuff 2024-07-08 10:14:38 +02:00
mlokr 11cb875882
some stuff 2024-07-08 10:13:50 +02:00
mlokr 8984dce0e7
more dependencies removed 2024-07-08 07:44:06 +02:00
mlokr fd64968f3a
more garbage 2024-07-08 07:27:40 +02:00
mlokr e00f2f08c8
removing garbage 2024-07-08 07:22:53 +02:00
mlokr 880cd66c66
on the journey twards struct destruct 2024-07-07 19:16:15 +02:00
mlokr fa41c56cb3
interstellar 2024-07-07 18:21:07 +02:00
mlokr efa7271a59
brahmaputra 4 2024-07-07 14:52:31 +02:00
mlokr bd7384123c
brahmaputra 2 2024-07-07 14:29:37 +02:00
mlokr e9589ebcae
brahmaputra 2024-07-07 14:26:43 +02:00
mlokr 22f925b3f5
bruh 2024-07-07 13:44:20 +02:00
mlokr 3807fe22da
YEEEEEEEEEEEEEEEEEEEEEEEEEEEES 2024-07-07 13:42:48 +02:00
mlokr 12c7467be2
removing string optimizations 2024-07-07 12:15:48 +02:00
mlokr cdc8cb35f7
GAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH, I did not delete unwanted relocations 2024-07-06 23:02:49 +02:00
mlokr 36bd1a796b
GAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH, I did not delete unwanted relocations 2024-07-06 23:02:04 +02:00
mlokr 59705c062d
adding better api 2024-07-06 15:18:57 +02:00
mlokr 9fe734c68c
adding better api 2024-07-06 15:05:56 +02:00
mlokr dc0562553d
adding better api 2024-07-06 14:58:50 +02:00
mlokr 91907a90ff
-__- 2024-07-02 14:49:05 +02:00
mlokr e147358fce
adding spart fmt to struct ctors 2024-06-25 21:51:41 +02:00
mlokr f9e46b4641
added formatting 2024-06-25 21:41:12 +02:00
mlokr 93deeee6b9
adding comments 2024-06-25 19:55:25 +02:00
mlokr 876690319f
slight improvement 2024-06-25 19:46:48 +02:00
mlokr c835317287
slight improvement 2024-06-25 19:22:49 +02:00
mlokr 8442b55aa6
improved exer 2024-06-25 19:13:42 +02:00
mlokr e07265c88b
improved exer 2024-06-25 19:12:35 +02:00
mlokr 6a69042cb7
improved exer 2024-06-25 18:39:59 +02:00
mlokr c85437e4e8
fixed a significant bugus 2024-06-24 17:45:58 +02:00
mlokr 76b3f9ff4b
pekomaaaaa 2024-06-24 17:26:00 +02:00
mlokr 66c3f7b0d4
cleanup 2024-06-23 13:55:48 +02:00
mlokr b04d9e517e
tests pass again 2024-06-23 09:26:03 +02:00
mlokr b46c64db4f
tests pass again 2024-06-23 09:09:33 +02:00
mlokr 6de8496aa5
whew 2024-06-21 23:07:32 +02:00
mlokr 499fe34f1d
psl 2024-06-20 11:23:37 +02:00
mlokr 36d978d798
cleaning up the docs 2024-06-20 11:18:36 +02:00
mlokr bd2a49d29a
brah 2024-06-15 10:49:02 +02:00
mlokr 1c8645bf11
fixing pre 2024-06-15 10:48:42 +02:00
mlokr 1624559e7b
little guide to add examples 2024-06-15 10:46:53 +02:00
mlokr 1ca5d89644
pup 2024-06-15 10:38:16 +02:00
mlokr 61ecbbd304
putting tests as examples in readme 2024-06-15 10:37:50 +02:00
mlokr 002a7df509
adding more elaborate directive example 2024-06-15 09:37:19 +02:00
mlokr 20903ef294
smh my head 2024-06-12 16:29:41 +02:00
mlokr aafcb2fbbd
size improvement 2024-06-06 15:39:10 +02:00
mlokr 98862edd58
other stuff 2024-06-01 20:30:15 +02:00
mlokr b9de362ba2
making modules work 2024-06-01 20:30:15 +02:00
mlokr e494785f93 some stuff 2024-05-20 14:11:58 +02:00
mlokr aef9951bc5 implementing comptime constants 2024-05-19 18:20:42 +02:00
mlokr b922dbd232 making progress on parallelization 2024-05-17 19:53:59 +02:00
mlokr 71c4d3632a welp that was an accident 2024-05-16 22:56:53 +02:00
mlokr 8cb9f2eaac fixing stack to what compiler assumes 2024-05-16 16:54:12 +02:00
mlokr aae217dd00 making better use of parameter and return registers (use register 2 for arguments when possible) 2024-05-16 16:50:29 +02:00
mlokr 4502a64514 some cleanup and bug fix 2024-05-16 13:32:04 +02:00
mlokr ca1d471646 adding imm operations is come cases 2024-05-16 13:29:16 +02:00
mlokr 2dff9f7244 slightly optimizing assignment 2024-05-16 12:56:33 +02:00
mlokr 3127d04e41 doing same for arguments 2024-05-16 12:42:11 +02:00
mlokr 589a30c8a3 making the variables smarter (only putting then on stack when they get referenced) 2024-05-16 12:23:37 +02:00
mlokr 8b81cfef37 cargo update 2024-05-15 14:39:36 +02:00
mlokr 6b74640c3f accident 2024-05-15 14:37:03 +02:00
mlokr 87ba7aa203 removing deendence on macros with a simple build script 2024-05-15 14:36:38 +02:00
mlokr 78f9eb6acc implementing codegen for all the operators 2024-05-15 11:10:20 +02:00
mlokr 3c09a5f23e adding '<op>=' syntax 2024-05-15 10:37:39 +02:00
mlokr 70955c1792 adding directives 2024-05-14 23:07:32 +02:00
mlokr d8a922df26 painfully, but suddle bugs with pointers are now fixed 2024-05-14 15:03:36 +02:00
mlokr 9aa5da82c9 saving before refactoring experiment 2024-05-14 14:01:40 +02:00
mlokr fb481a0600 making stack reclamation 2024-05-14 12:17:39 +02:00
mlokr d90f386bd2 adding an edge case 2024-05-13 15:27:09 +02:00
mlokr c14e6c352d fixing upcating bugs related to pointers 2024-05-13 15:24:48 +02:00
mlokr 9ccf91d072 adding pointer arithmetic test 2024-05-13 14:23:19 +02:00
mlokr 7cca9a3683 adding different-sized integers 2024-05-13 13:36:29 +02:00
mlokr b28baa86f7 adding destination semantics for expressions 2024-05-13 11:05:35 +02:00
mlokr 2226a47aaa fixing struct regurns 2024-05-13 09:55:09 +02:00
mlokr 0aec47e985 fixing struct regurns 2024-05-13 09:38:33 +02:00
mlokr 5c38115119 adding simple cli 2024-05-13 00:02:32 +02:00
mlokr c3cbd054f7 improving code 2024-05-12 23:45:28 +02:00
mlokr 06e30529bf added better error reports 2024-05-12 23:19:45 +02:00
mlokr 4ec635dc56 proper type display and pointer types 2024-05-12 22:40:28 +02:00
mlokr a08856a464 fising spec 2024-05-12 20:14:46 +02:00
mlokr d5a5c932e7 finishing structures 2024-05-12 20:10:50 +02:00
mlokr bc59886428 adding per function dead code elimination 2024-05-12 14:56:59 +02:00
mlokr f87959aacb adding structures 2024-05-12 13:13:36 +02:00
mlokr 80b05779ea adding struct syntax 2024-05-12 12:21:40 +02:00
mlokr 4bb5ec1953 adding struct syntax 2024-05-12 12:16:40 +02:00
mlokr 2aa315a863 accident 2024-05-12 11:53:21 +02:00
mlokr 86013a50a4 identifiers are now properly checked 2024-05-12 11:52:58 +02:00
mlokr 465b185452 foo bar 2024-05-11 22:23:58 +02:00
mlokr b794fa7c3c foo bar 2024-05-11 22:22:12 +02:00
able ebefc85566 :3 Breaky 2024-05-11 15:21:07 -05:00
mlokr a3c4b878b2 adding loops 2024-05-11 18:16:27 +02:00
mlokr 7f32e7775c making if statements without else branch work 2024-05-11 17:05:22 +02:00
mlokr 1d74f27b0e making the functions kind of walk 2024-05-11 16:04:13 +02:00
mlokr 7435218999 fixing relative jumps to not offset from immidiate adress but from instruction adress 2024-05-11 12:51:32 +02:00
mlokr cf99091a45 now compiling some trivial arithmetic 2024-05-10 22:54:12 +02:00
mlokr 81952cfc40 compiling return stmt 2024-05-10 21:38:15 +02:00
mlokr 68d53544fd compiling return stmt 2024-05-10 21:33:42 +02:00
mlokr aa77a2f822 fixing JALA and JAL saving self reference instead of reference to the next instruction 2024-05-10 15:29:11 +02:00
mlokr b80528bfd7 accident 2024-05-09 23:43:18 +02:00
mlokr 1c08148dc9 starting from zero again 2024-05-09 23:41:59 +02:00
mlokr 774735b515 ups 2024-05-09 18:24:30 +02:00
mlokr 870c1f4718 blah 2024-05-09 18:22:31 +02:00
mlokr 326adf47ce wha 2024-05-09 18:18:19 +02:00
178 changed files with 22392 additions and 1739 deletions


@@ -1,2 +1,4 @@
[alias]
xtask = "r -p xtask --"
wasm-build = "b --target wasm32-unknown-unknown --profile=small -Zbuild-std=core,alloc -Zbuild-std-features=optimize_for_size,panic_immediate_abort -p"
wasm-build-debug = "b --target wasm32-unknown-unknown --profile=small-dev -Zbuild-std=core,alloc -Zbuild-std-features=optimize_for_size -p"

.gitignore (vendored, 12 changed lines)

@@ -1 +1,13 @@
# garbage
/target
rustc-ice-*
# sqlite
db.sqlite
db.sqlite-journal
# assets
/depell/src/*.gz
/depell/src/*.wasm
#**/*-sv.rs
/bytecode/src/instrs.rs

Cargo.lock (generated, 1573 changed lines)

File diff suppressed because it is too large.


@@ -1,3 +1,50 @@
cargo-features = ["profile-rustflags"]
[workspace]
resolver = "2"
members = ["hbasm", "hbbytecode", "hbvm", "hbxrt", "xtask"]
members = [
"bytecode",
"vm",
"xrt",
"xtask",
"lang",
"depell",
"depell/wasm-fmt",
"depell/wasm-hbc",
"depell/wasm-rt",
]
[workspace.dependencies]
hbbytecode = { path = "bytecode", default-features = false }
hbvm = { path = "vm", default-features = false }
hbxrt = { path = "xrt" }
hblang = { path = "lang", default-features = false }
hbjit = { path = "jit" }
[profile.release]
lto = true
#debug = true
strip = true
codegen-units = 1
panic = "abort"
[profile.small]
rustflags = ["-Zfmt-debug=none", "-Zlocation-detail=none"]
inherits = "release"
opt-level = "z"
strip = "debuginfo"
lto = true
codegen-units = 1
panic = "abort"
[profile.small-dev]
inherits = "dev"
opt-level = "z"
strip = "debuginfo"
panic = "abort"
[profile.fuzz]
inherits = "dev"
debug = true
opt-level = 3
panic = "abort"


@@ -3,6 +3,8 @@ name = "hbbytecode"
version = "0.1.0"
edition = "2018"
[dependencies]
paste = "1.0.14"
with_builtin_macros = "0.0.3"
[features]
default = ["disasm"]
std = []
disasm = ["std"]

bytecode/build.rs (new file, 204 lines)

@@ -0,0 +1,204 @@
#![feature(iter_next_chunk)]
use std::{collections::HashSet, fmt::Write};
fn main() -> Result<(), Box<dyn std::error::Error>> {
println!("cargo:rerun-if-changed=build.rs");
println!("cargo:rerun-if-changed=instructions.in");
let mut generated = String::new();
gen_instrs(&mut generated)?;
std::fs::write("src/instrs.rs", generated)?;
Ok(())
}
fn gen_instrs(generated: &mut String) -> Result<(), Box<dyn std::error::Error>> {
writeln!(generated, "#![expect(dead_code)]")?;
writeln!(generated, "use crate::*;")?;
'_opcode_structs: {
let mut seen = HashSet::new();
for [.., args, _] in instructions() {
if !seen.insert(args) {
continue;
}
writeln!(generated, "#[derive(Clone, Copy, Debug)]")?;
writeln!(generated, "#[repr(packed)]")?;
write!(generated, "pub struct Ops{args}(")?;
let mut first = true;
for ch in args.chars().filter(|&ch| ch != 'N') {
if !std::mem::take(&mut first) {
write!(generated, ",")?;
}
write!(generated, "pub Op{ch}")?;
}
writeln!(generated, ");")?;
writeln!(generated, "unsafe impl BytecodeItem for Ops{args} {{}}")?;
}
}
'_max_size: {
let max = instructions()
.map(
|[_, _, ty, _]| {
if ty == "N" {
1
} else {
iter_args(ty).map(arg_to_width).sum::<usize>() + 1
}
},
)
.max()
.unwrap();
writeln!(generated, "pub const MAX_SIZE: usize = {max};")?;
}
'_encoders: {
for [op, name, ty, doc] in instructions() {
writeln!(generated, "/// {}", doc.trim_matches('"'))?;
let name = name.to_lowercase();
let args = comma_sep(
iter_args(ty)
.enumerate()
.map(|(i, c)| format!("{}{i}: {}", arg_to_name(c), arg_to_type(c))),
);
writeln!(generated, "pub fn {name}({args}) -> (usize, [u8; MAX_SIZE]) {{")?;
let arg_names =
comma_sep(iter_args(ty).enumerate().map(|(i, c)| format!("{}{i}", arg_to_name(c))));
writeln!(generated, " unsafe {{ crate::encode({ty}({op}, {arg_names})) }}")?;
writeln!(generated, "}}")?;
}
}
'_structs: {
let mut seen = std::collections::HashSet::new();
for [_, _, ty, _] in instructions() {
if !seen.insert(ty) {
continue;
}
let types = comma_sep(iter_args(ty).map(arg_to_type).map(|s| s.to_string()));
writeln!(generated, "#[repr(packed)] pub struct {ty}(u8, {types});")?;
}
}
'_name_list: {
writeln!(generated, "pub const COUNT: u8 = {};", instructions().count())?;
}
let instr = "Instr";
let oper = "Oper";
'_instr_enum: {
writeln!(generated, "#[derive(Debug, Clone, Copy, PartialEq, Eq)] #[repr(u8)]")?;
writeln!(generated, "pub enum {instr} {{")?;
for [id, name, ..] in instructions() {
writeln!(generated, " {name} = {id},")?;
}
writeln!(generated, "}}")?;
}
'_arg_kind: {
writeln!(generated, "#[derive(Debug, Clone, Copy, PartialEq, Eq)]")?;
writeln!(generated, "pub enum {oper} {{")?;
let mut seen = HashSet::new();
for ty in instructions().flat_map(|[.., ty, _]| iter_args(ty)) {
if !seen.insert(ty) {
continue;
}
writeln!(generated, " {ty}({}),", arg_to_type(ty))?;
}
writeln!(generated, "}}")?;
}
'_parse_opers: {
writeln!(
generated,
"/// This assumes the instruction byte is still at the beginning of the buffer"
)?;
writeln!(generated, "#[cfg(feature = \"disasm\")]")?;
writeln!(generated, "pub fn parse_args(bytes: &mut &[u8], kind: {instr}, buf: &mut alloc::vec::Vec<{oper}>) -> Option<()> {{")?;
writeln!(generated, " match kind {{")?;
let mut instrs = instructions().collect::<Vec<_>>();
instrs.sort_unstable_by_key(|&[.., ty, _]| ty);
for group in instrs.chunk_by(|[.., a, _], [.., b, _]| a == b) {
let ty = group[0][2];
for &[_, name, ..] in group {
writeln!(generated, " | {instr}::{name}")?;
}
generated.pop();
writeln!(generated, " => {{")?;
if iter_args(ty).count() != 0 {
writeln!(generated, " let data = crate::decode::<{ty}>(bytes)?;")?;
writeln!(
generated,
" buf.extend([{}]);",
comma_sep(
iter_args(ty).zip(1u32..).map(|(t, i)| format!("{oper}::{t}(data.{i})"))
)
)?;
} else {
writeln!(generated, " crate::decode::<{ty}>(bytes)?;")?;
}
writeln!(generated, " }}")?;
}
writeln!(generated, " }}")?;
writeln!(generated, " Some(())")?;
writeln!(generated, "}}")?;
}
std::fs::write("src/instrs.rs", generated)?;
Ok(())
}
fn comma_sep(items: impl Iterator<Item = String>) -> String {
items.map(|item| item.to_string()).collect::<Vec<_>>().join(", ")
}
fn instructions() -> impl Iterator<Item = [&'static str; 4]> {
include_str!("instructions.in")
.lines()
.filter_map(|line| line.strip_suffix(';'))
.map(|line| line.splitn(4, ',').map(str::trim).next_chunk().unwrap())
}
fn arg_to_type(arg: char) -> &'static str {
match arg {
'R' | 'B' => "u8",
'H' => "u16",
'W' => "u32",
'D' | 'A' => "u64",
'P' => "i16",
'O' => "i32",
_ => panic!("unknown type: {}", arg),
}
}
fn arg_to_width(arg: char) -> usize {
match arg {
'R' | 'B' => 1,
'H' => 2,
'W' => 4,
'D' | 'A' => 8,
'P' => 2,
'O' => 4,
_ => panic!("unknown type: {}", arg),
}
}
fn arg_to_name(arg: char) -> &'static str {
match arg {
'R' => "reg",
'B' | 'H' | 'W' | 'D' => "imm",
'P' | 'O' => "offset",
'A' => "addr",
_ => panic!("unknown type: {}", arg),
}
}
fn iter_args(ty: &'static str) -> impl Iterator<Item = char> {
ty.chars().filter(|c| *c != 'N')
}
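
The build script above generates `src/instrs.rs` entirely from `instructions.in`, which is not included in this compare view. Going only by how `instructions()` splits each record (a `;`-terminated line holding opcode, mnemonic, operand-kind string, and doc string), a minimal stable-Rust sketch of that parsing step could look as follows; the sample line, its opcode value, and the `RRD` operand layout are illustrative assumptions, not taken from the real file.

```rust
// Sketch of the record format assumed by `instructions()` above, written in
// stable Rust (no `iter_next_chunk`). The sample instruction is hypothetical.
fn parse_line(line: &str) -> Option<[&str; 4]> {
    // A record ends with ';' and holds: opcode, NAME, operand kinds, doc string.
    let line = line.strip_suffix(';')?;
    let mut fields = line.splitn(4, ',').map(str::trim);
    Some([fields.next()?, fields.next()?, fields.next()?, fields.next()?])
}

fn main() {
    let sample = r#"0x48, ADDI64, RRD, "Addition with immediate (64 bit)";"#;
    let [op, name, kinds, doc] = parse_line(sample).unwrap();
    assert_eq!((op, name, kinds), ("0x48", "ADDI64", "RRD"));
    assert!(doc.starts_with('"'));
    // Per `arg_to_width`, 'R' is a 1-byte register and 'D' an 8-byte immediate,
    // so an "RRD" instruction would take 1 (opcode) + 1 + 1 + 8 = 11 bytes.
}
```

Under this assumed format, `gen_instrs` turns every such record into an encoder function, a `#[repr(packed)]` operand struct, a variant of the `Instr` enum, and an arm of `parse_args`.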

bytecode/src/lib.rs (new file, 284 lines)

@@ -0,0 +1,284 @@
#![no_std]
#[cfg(feature = "disasm")]
extern crate alloc;
pub use crate::instrs::*;
use core::convert::TryFrom;
mod instrs;
type OpR = u8;
type OpA = u64;
type OpO = i32;
type OpP = i16;
type OpB = u8;
type OpH = u16;
type OpW = u32;
type OpD = u64;
/// # Safety
/// Has to be valid to be decoded from bytecode.
pub unsafe trait BytecodeItem {}
unsafe impl BytecodeItem for u8 {}
impl TryFrom<u8> for Instr {
type Error = u8;
#[inline]
fn try_from(value: u8) -> Result<Self, Self::Error> {
#[cold]
fn failed(value: u8) -> Result<Instr, u8> {
Err(value)
}
if value < COUNT {
unsafe { Ok(core::mem::transmute::<u8, Instr>(value)) }
} else {
failed(value)
}
}
}
#[inline]
unsafe fn encode<T>(instr: T) -> (usize, [u8; instrs::MAX_SIZE]) {
let mut buf = [0; instrs::MAX_SIZE];
core::ptr::write(buf.as_mut_ptr() as *mut T, instr);
(core::mem::size_of::<T>(), buf)
}
#[inline]
#[cfg(feature = "disasm")]
fn decode<T>(binary: &mut &[u8]) -> Option<T> {
let (front, rest) = core::mem::take(binary).split_at_checked(core::mem::size_of::<T>())?;
*binary = rest;
unsafe { Some(core::ptr::read(front.as_ptr() as *const T)) }
}
/// Rounding mode
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
#[repr(u8)]
pub enum RoundingMode {
NearestEven = 0,
Truncate = 1,
Up = 2,
Down = 3,
}
impl TryFrom<u8> for RoundingMode {
type Error = ();
fn try_from(value: u8) -> Result<Self, Self::Error> {
(value <= 3).then(|| unsafe { core::mem::transmute(value) }).ok_or(())
}
}
#[cfg(feature = "disasm")]
#[derive(Clone, Copy)]
pub enum DisasmItem {
Func,
Global,
}
#[cfg(feature = "disasm")]
#[derive(Debug)]
pub enum DisasmError<'a> {
InvalidInstruction(u8),
InstructionOutOfBounds(&'a str),
FmtFailed(core::fmt::Error),
HasOutOfBoundsJumps,
}
#[cfg(feature = "disasm")]
impl From<core::fmt::Error> for DisasmError<'_> {
fn from(value: core::fmt::Error) -> Self {
Self::FmtFailed(value)
}
}
#[cfg(feature = "disasm")]
impl core::fmt::Display for DisasmError<'_> {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
match *self {
DisasmError::InvalidInstruction(b) => write!(f, "invalid instruction opcode: {b}"),
DisasmError::InstructionOutOfBounds(name) => {
write!(f, "instruction would go out of bounds of {name} symbol")
}
DisasmError::FmtFailed(error) => write!(f, "fmt failed: {error}"),
DisasmError::HasOutOfBoundsJumps => write!(
f,
"the code contained jumps that dont got neither to a \
valid symbol or local insturction"
),
}
}
}
#[cfg(feature = "disasm")]
impl core::error::Error for DisasmError<'_> {}
#[cfg(feature = "disasm")]
pub fn disasm<'a>(
binary: &mut &[u8],
functions: &alloc::collections::BTreeMap<u32, (&'a str, u32, DisasmItem)>,
out: &mut alloc::string::String,
mut eca_handler: impl FnMut(&mut &[u8]),
) -> Result<(), DisasmError<'a>> {
use {
self::instrs::Instr,
alloc::{
collections::btree_map::{BTreeMap, Entry},
vec::Vec,
},
core::{convert::TryInto, fmt::Write},
};
fn instr_from_byte(b: u8) -> Result<Instr, DisasmError<'static>> {
b.try_into().map_err(DisasmError::InvalidInstruction)
}
let mut labels = BTreeMap::<u32, u32>::default();
let mut buf = Vec::<instrs::Oper>::new();
let mut has_oob = false;
'_offset_pass: for (&off, &(name, len, kind)) in functions.iter() {
if matches!(kind, DisasmItem::Global) {
continue;
}
let prev = *binary;
*binary = &binary[off as usize..];
let mut label_count = 0;
while let Some(&byte) = binary.first() {
let offset: i32 = (prev.len() - binary.len()).try_into().unwrap();
if offset as u32 == off + len {
break;
}
let Ok(inst) = instr_from_byte(byte) else { break };
instrs::parse_args(binary, inst, &mut buf)
.ok_or(DisasmError::InstructionOutOfBounds(name))?;
for op in buf.drain(..) {
let rel = match op {
instrs::Oper::O(rel) => rel,
instrs::Oper::P(rel) => rel.into(),
_ => continue,
};
let global_offset: u32 = (offset + rel).try_into().unwrap();
if functions.get(&global_offset).is_some() {
continue;
}
label_count += match labels.entry(global_offset) {
Entry::Occupied(_) => 0,
Entry::Vacant(entry) => {
entry.insert(label_count);
1
}
}
}
if matches!(inst, Instr::ECA) {
eca_handler(binary);
}
}
*binary = prev;
}
let mut ordered = functions.iter().collect::<Vec<_>>();
ordered.sort_unstable_by_key(|(_, (name, _, _))| name);
'_dump: for (&off, &(name, len, kind)) in ordered {
if matches!(kind, DisasmItem::Global) {
continue;
}
let prev = *binary;
writeln!(out, "{name}:")?;
*binary = &binary[off as usize..];
while let Some(&byte) = binary.first() {
let offset: i32 = (prev.len() - binary.len()).try_into().unwrap();
if offset as u32 == off + len {
break;
}
let Ok(inst) = instr_from_byte(byte) else {
writeln!(out, "invalid instr {byte}")?;
break;
};
instrs::parse_args(binary, inst, &mut buf).unwrap();
if let Some(label) = labels.get(&offset.try_into().unwrap()) {
write!(out, "{:>2}: ", label)?;
} else {
write!(out, " ")?;
}
write!(out, "{inst:<8?} ")?;
'a: for (i, op) in buf.drain(..).enumerate() {
if i != 0 {
write!(out, ", ")?;
}
let rel = 'b: {
match op {
instrs::Oper::O(rel) => break 'b rel,
instrs::Oper::P(rel) => break 'b rel.into(),
instrs::Oper::R(r) => write!(out, "r{r}")?,
instrs::Oper::B(b) => write!(out, "{b}b")?,
instrs::Oper::H(h) => write!(out, "{h}h")?,
instrs::Oper::W(w) => write!(out, "{w}w")?,
instrs::Oper::D(d) if (d as i64) < 0 => write!(out, "{}d", d as i64)?,
instrs::Oper::D(d) => write!(out, "{d}d")?,
instrs::Oper::A(a) => write!(out, "{a}a")?,
}
continue 'a;
};
let global_offset: u32 = (offset + rel).try_into().unwrap();
if let Some(&(name, ..)) = functions.get(&global_offset) {
if name.contains('\0') {
write!(out, ":{name:?}")?;
} else {
write!(out, ":{name}")?;
}
} else {
let local_has_oob = global_offset < off
|| global_offset > off + len
|| prev
.get(global_offset as usize)
.map_or(true, |&b| instr_from_byte(b).is_err())
|| prev[global_offset as usize] == 0;
has_oob |= local_has_oob;
let label = labels.get(&global_offset).unwrap();
if local_has_oob {
write!(out, "!!!!!!!!!{rel}")?;
} else {
write!(out, ":{label}")?;
}
}
}
writeln!(out)?;
if matches!(inst, Instr::ECA) {
eca_handler(binary);
}
}
*binary = prev;
}
if has_oob {
return Err(DisasmError::HasOutOfBoundsJumps);
}
Ok(())
}
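
To show how these pieces fit together, here is a minimal usage sketch: an encoder from the generated `instrs.rs` produces a `(len, buffer)` pair, the leading byte round-trips through `Instr::try_from`, and, with the default `disasm` feature, `disasm` renders the code against a small symbol table. The `addi64` constructor and its `(reg, reg, imm)` argument order are assumptions about the generated code, since neither `instructions.in` nor `instrs.rs` appears in this diff; the rest follows the lib.rs listing above.

```rust
// Usage sketch only; `hbbytecode::addi64` is assumed to be one of the generated
// encoders, everything else is taken from the lib.rs code shown above.
use hbbytecode::{disasm, DisasmItem, Instr};
use std::collections::BTreeMap;

fn main() {
    // Generated encoders return (encoded_len, [u8; MAX_SIZE]); only the first
    // `len` bytes are meaningful.
    let (len, buf) = hbbytecode::addi64(1, 1, 8); // hypothetical: r1 = r1 + 8
    let code = &buf[..len];

    // The leading byte of every instruction is its opcode.
    let instr = Instr::try_from(code[0]).expect("encoders emit known opcodes");
    println!("{instr:?} is {len} bytes long");

    // `disasm` walks a symbol table mapping offset -> (name, length, kind).
    let mut symbols = BTreeMap::new();
    symbols.insert(0u32, ("main", len as u32, DisasmItem::Func));
    let mut out = String::new();
    disasm(&mut &code[..], &symbols, &mut out, |_| {}).expect("in-bounds code disassembles");
    print!("{out}");
}
```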

depell/Cargo.toml (new file, 23 lines)

@@ -0,0 +1,23 @@
[package]
name = "depell"
version = "0.1.0"
edition = "2021"
[dependencies]
argon2 = "0.5.3"
axum = "0.7.7"
axum-server = { version = "0.7.1", optional = true, features = ["rustls", "tls-rustls"] }
const_format = "0.2.33"
getrandom = "0.2.15"
hblang.workspace = true
htmlm = "0.5.0"
log = "0.4.22"
rand_core = { version = "0.6.4", features = ["getrandom"] }
rusqlite = { version = "0.32.1", features = ["bundled"] }
serde = { version = "1.0.210", features = ["derive"] }
time = "0.3.36"
tokio = { version = "1.40.0", features = ["rt"] }
[features]
#default = ["tls"]
tls = ["dep:axum-server"]

depell/README.md (new file, 14 lines)

@@ -0,0 +1,14 @@
# Depell
Depell is a website that allows users to import/post/run hblang code and create huge dependency graphs. It's currently hosted at https://depell.mlokis.tech.
## Local Development
Prerequisites:
- Rust nightly toolchain: install Rust from [here](https://www.rust-lang.org/tools/install)
```bash
rustup default nightly
cargo xtask watch-depell-debug
# browser http://localhost:8080
```


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="24px" viewBox="0 -960 960 960" width="24px" fill="#e8eaed"><path d="M480-320 280-520l56-58 104 104v-326h80v326l104-104 56 58-200 200ZM240-160q-33 0-56.5-23.5T160-240v-120h80v120h480v-120h80v120q0 33-23.5 56.5T720-160H240Z"/></svg>

depell/src/icons/run.svg (new file, 1 line)

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="24px" viewBox="0 -960 960 960" width="24px" fill="#e8eaed"><path d="M320-200v-560l440 280-440 280Zm80-280Zm0 134 210-134-210-134v268Z"/></svg>

depell/src/index.css (new file, 207 lines)

@@ -0,0 +1,207 @@
* {
font-family: var(--font);
}
body {
--primary: light-dark(white, #181A1B);
--secondary: light-dark(#EFEFEF, #212425);
--timestamp: light-dark(#555555, #AAAAAA);
--error: #ff3333;
}
body {
--small-gap: 5px;
--font: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif;
--monospace: 'Courier New', Courier, monospace;
nav {
display: flex;
justify-content: space-between;
align-items: center;
section:last-child {
display: flex;
gap: var(--small-gap);
}
}
main {
margin-top: var(--small-gap);
display: flex;
flex-direction: column;
gap: var(--small-gap);
}
}
div.preview {
margin: var(--small-gap) 0px;
display: flex;
flex-direction: column;
gap: var(--small-gap);
div.info {
display: flex;
gap: var(--small-gap);
span[apply=timestamp] {
color: var(--timestamp);
}
}
div.stat {
display: flex;
svg {
height: 18px;
}
}
div.code {
position: relative;
nav {
position: absolute;
right: 0;
padding: var(--small-gap);
button {
display: flex;
padding: 0;
}
}
}
}
svg {
fill: black;
}
form {
display: flex;
flex-direction: column;
gap: var(--small-gap);
.error {
color: var(--error);
text-align: center;
}
}
textarea {
outline: none;
border: none;
background: var(--secondary);
padding: var(--small-gap);
padding-top: calc(var(--small-gap) * 1.5);
font-family: var(--monospace);
resize: none;
tab-size: 4;
}
pre {
background: var(--secondary);
padding: var(--small-gap);
padding-top: calc(var(--small-gap) * 1.5);
margin: 0px;
font-family: var(--monospace);
tab-size: 4;
overflow-x: auto;
white-space: pre-wrap;
word-wrap: break-word;
}
input {
font-size: inherit;
outline: none;
border: none;
background: var(--secondary);
padding: var(--small-gap);
}
input:is(:hover, :focus) {
background: var(--primary);
}
button {
border: none;
outline: none;
font-size: inherit;
background: var(--secondary);
}
button:hover:not(:active) {
background: var(--primary);
}
div#code-editor {
display: flex;
position: relative;
textarea {
flex: 1;
}
span#code-size {
position: absolute;
right: 2px;
font-size: 12px;
}
}
div#dep-list {
display: flex;
flex-direction: column;
align-items: center;
gap: var(--small-gap);
section {
width: 100%;
display: flex;
flex-direction: column;
text-align: center;
gap: var(--small-gap);
div {
text-align: left;
}
}
}
.syn {
font-family: var(--monospace);
&.Comment {
color: #939f91;
}
&.Keyword {
color: #f85552;
}
&.Identifier,
&.Directive {
color: #3a94c5;
}
/* &.Number {} */
&.String {
color: #8da101;
}
&.Op,
&.Assign {
color: #f57d26;
}
&.Paren,
&.Bracket,
&.Comma,
&.Dot,
&.Ctor,
&.Colon {
color: light-dark(#5c6a72, #999999);
}
}

551
depell/src/index.js Normal file
View file

@ -0,0 +1,551 @@
/// @ts-check
/** @return {never} */
function never() { throw new Error() }
/**@type{WebAssembly.Instance}*/ let hbcInstance;
/**@type{Promise<WebAssembly.WebAssemblyInstantiatedSource>}*/ let hbcInstanceFuture;
async function getHbcInstance() {
hbcInstanceFuture ??= WebAssembly.instantiateStreaming(fetch("/hbc.wasm"), {});
return hbcInstance ??= (await hbcInstanceFuture).instance;
}
const stack_pointer_offset = 1 << 20;
/** @param {WebAssembly.Instance} instance @param {Post[]} packages @param {number} fuel
* @returns {string} */
function compileCode(instance, packages, fuel = 100) {
let {
INPUT, INPUT_LEN,
LOG_MESSAGES, LOG_MESSAGES_LEN,
memory, compile_and_run,
} = instance.exports;
if (!(true
&& memory instanceof WebAssembly.Memory
&& INPUT instanceof WebAssembly.Global
&& INPUT_LEN instanceof WebAssembly.Global
&& LOG_MESSAGES instanceof WebAssembly.Global
&& LOG_MESSAGES_LEN instanceof WebAssembly.Global
&& typeof compile_and_run === "function"
)) never();
const codeLength = packPosts(packages, new DataView(memory.buffer, INPUT.value));
new DataView(memory.buffer).setUint32(INPUT_LEN.value, codeLength, true);
runWasmFunction(instance, compile_and_run, fuel);
return bufToString(memory, LOG_MESSAGES, LOG_MESSAGES_LEN).trim();
}
/**@type{WebAssembly.Instance}*/ let fmtInstance;
/**@type{Promise<WebAssembly.WebAssemblyInstantiatedSource>}*/ let fmtInstanceFuture;
async function getFmtInstance() {
fmtInstanceFuture ??= WebAssembly.instantiateStreaming(fetch("/hbfmt.wasm"), {});
return fmtInstance ??= (await fmtInstanceFuture).instance;
}
/** @param {WebAssembly.Instance} instance @param {string} code @param {"tok" | "fmt" | "minify"} action
* @returns {string | Uint8Array | undefined} */
function modifyCode(instance, code, action) {
let {
INPUT, INPUT_LEN,
OUTPUT, OUTPUT_LEN,
memory, fmt, tok, minify
} = instance.exports;
let funs = { fmt, tok, minify };
let fun = funs[action];
if (!(true
&& memory instanceof WebAssembly.Memory
&& INPUT instanceof WebAssembly.Global
&& INPUT_LEN instanceof WebAssembly.Global
&& OUTPUT instanceof WebAssembly.Global
&& OUTPUT_LEN instanceof WebAssembly.Global
&& funs.hasOwnProperty(action)
&& typeof fun === "function"
)) never();
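// tok and minify operate in place on the OUTPUT buffer on the wasm side
// (see wasm-hbfmt), so for those actions the source is written into OUTPUT
// instead of INPUT.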
if (action !== "fmt") {
INPUT = OUTPUT;
INPUT_LEN = OUTPUT_LEN;
}
let dw = new DataView(memory.buffer);
dw.setUint32(INPUT_LEN.value, code.length, true);
new Uint8Array(memory.buffer, INPUT.value).set(new TextEncoder().encode(code));
if (!runWasmFunction(instance, fun)) {
return undefined;
}
if (action === "tok") {
return bufSlice(memory, OUTPUT, OUTPUT_LEN);
} else {
return bufToString(memory, OUTPUT, OUTPUT_LEN);
}
}
/** @param {WebAssembly.Instance} instance @param {CallableFunction} func @param {any[]} args
* @returns {boolean} */
function runWasmFunction(instance, func, ...args) {
const { PANIC_MESSAGE, PANIC_MESSAGE_LEN, memory, stack_pointer } = instance.exports;
if (!(true
&& memory instanceof WebAssembly.Memory
&& stack_pointer instanceof WebAssembly.Global
)) never();
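// Remember the shadow stack pointer so it can be restored if the call traps;
// a trap skips the callee's normal epilogue and would otherwise leak stack
// space across runs.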
const ptr = stack_pointer.value;
try {
func(...args);
return true;
} catch (error) {
if (error instanceof WebAssembly.RuntimeError
&& error.message == "unreachable"
&& PANIC_MESSAGE instanceof WebAssembly.Global
&& PANIC_MESSAGE_LEN instanceof WebAssembly.Global) {
console.error(bufToString(memory, PANIC_MESSAGE, PANIC_MESSAGE_LEN), error);
} else {
console.error(error);
}
stack_pointer.value = ptr;
return false;
}
}
/** @typedef {Object} Post
* @property {string} path
* @property {string} code */
/** @param {Post[]} posts @param {DataView} view @returns {number} */
function packPosts(posts, view) {
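// Record layout consumed by wasm-hbc's compile_and_run: u16 LE path length,
// path bytes, u16 LE code length, code bytes. Lengths are JS string lengths,
// which equal the encoded byte counts only for ASCII sources.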
const enc = new TextEncoder(), buf = new Uint8Array(view.buffer, view.byteOffset);
let len = 0; for (const post of posts) {
view.setUint16(len, post.path.length, true); len += 2;
buf.set(enc.encode(post.path), len); len += post.path.length;
view.setUint16(len, post.code.length, true); len += 2;
buf.set(enc.encode(post.code), len); len += post.code.length;
}
return len;
}
/** @param {WebAssembly.Memory} mem
* @param {WebAssembly.Global} ptr
* @param {WebAssembly.Global} len
* @return {Uint8Array} */
function bufSlice(mem, ptr, len) {
return new Uint8Array(mem.buffer, ptr.value,
new DataView(mem.buffer).getUint32(len.value, true));
}
/** @param {WebAssembly.Memory} mem
* @param {WebAssembly.Global} ptr
* @param {WebAssembly.Global} len
* @return {string} */
function bufToString(mem, ptr, len) {
const res = new TextDecoder()
.decode(new Uint8Array(mem.buffer, ptr.value,
new DataView(mem.buffer).getUint32(len.value, true)));
new DataView(mem.buffer).setUint32(len.value, 0, true);
return res;
}
/** @param {HTMLElement} target */
function wireUp(target) {
execApply(target);
cacheInputs(target);
bindCodeEdit(target);
bindTextareaAutoResize(target);
}
const importRe = /@use\s*\(\s*"(([^"]|\\")+)"\s*\)/g;
/** @param {WebAssembly.Instance} fmt
* @param {string} code
* @param {string[]} roots
* @param {Post[]} buf
* @param {Set<string>} prevRoots
* @returns {void} */
function loadCachedPackages(fmt, code, roots, buf, prevRoots) {
buf[0].code = code;
roots.length = 0;
let changed = false;
for (const match of code.matchAll(importRe)) {
changed ||= !prevRoots.has(match[1]);
roots.push(match[1]);
}
if (!changed) return;
buf.length = 1;
prevRoots.clear();
for (let imp = roots.pop(); imp !== undefined; imp = roots.pop()) {
if (prevRoots.has(imp)) continue; prevRoots.add(imp);
const fmtd = modifyCode(fmt, localStorage.getItem("package-" + imp) ?? never(), "fmt");
if (typeof fmtd != "string") never();
buf.push({ path: imp, code: fmtd });
for (const match of buf[buf.length - 1].code.matchAll(importRe)) {
roots.push(match[1]);
}
}
}
/**@type{Set<string>}*/ const prevRoots = new Set();
/**@typedef {Object} PackageCtx
* @property {AbortController} [cancelation]
* @property {string[]} keyBuf
* @property {Set<string>} prevParams
* @property {HTMLTextAreaElement} [edit] */
/** @param {string} source @param {Set<string>} importDiff @param {HTMLPreElement} errors @param {PackageCtx} ctx */
async function fetchPackages(source, importDiff, errors, ctx) {
importDiff.clear();
for (const match of source.matchAll(importRe)) {
if (localStorage["package-" + match[1]]) continue;
importDiff.add(match[1]);
}
if (importDiff.size !== 0 && (ctx.prevParams.size != importDiff.size
|| [...ctx.prevParams.keys()].every(e => importDiff.has(e)))) {
if (ctx.cancelation) ctx.cancelation.abort();
ctx.prevParams.clear();
ctx.prevParams = new Set([...importDiff]);
ctx.cancelation = new AbortController();
ctx.keyBuf.length = 0;
ctx.keyBuf.push(...importDiff.keys());
errors.textContent = "fetching: " + ctx.keyBuf.join(", ");
await fetch(`/code`, {
method: "POST",
signal: ctx.cancelation.signal,
headers: { "Content-Type": "application/json" },
body: JSON.stringify(ctx.keyBuf),
}).then(async e => {
try {
const json = await e.json();
if (e.status == 200) {
for (const [key, value] of Object.entries(json)) {
localStorage["package-" + key] = value;
}
const missing = ctx.keyBuf.filter(i => json[i] === undefined);
if (missing.length !== 0) {
errors.textContent = "deps not found: " + missing.join(", ");
} else {
ctx.cancelation = undefined;
ctx.edit?.dispatchEvent(new InputEvent("input"));
}
}
} catch (er) {
errors.textContent = "completely failed to fetch ("
+ e.status + "): " + ctx.keyBuf.join(", ");
console.error(e, er);
}
});
}
}
/** @param {HTMLElement} target */
async function bindCodeEdit(target) {
const edit = target.querySelector("#code-edit");
if (!(edit instanceof HTMLTextAreaElement)) return;
const codeSize = target.querySelector("#code-size");
const errors = target.querySelector("#compiler-output");
if (!(true
&& codeSize instanceof HTMLSpanElement
&& errors instanceof HTMLPreElement
)) never();
const MAX_CODE_SIZE = parseInt(codeSize.innerHTML);
if (Number.isNaN(MAX_CODE_SIZE)) never();
const hbc = await getHbcInstance(), fmt = await getFmtInstance();
let importDiff = new Set();
/**@type{Post[]}*/
const packages = [{ path: "local.hb", code: "" }];
const debounce = 100;
let timeout = 0;
const ctx = { keyBuf: [], prevParams: new Set(), edit };
prevRoots.clear();
const onInput = () => {
fetchPackages(edit.value, importDiff, errors, ctx);
if (ctx.cancelation && importDiff.size !== 0) {
return;
}
loadCachedPackages(fmt, edit.value, ctx.keyBuf, packages, prevRoots);
errors.textContent = compileCode(hbc, packages);
const minified_size = modifyCode(fmt, edit.value, "minify")?.length;
if (minified_size) {
codeSize.textContent = (MAX_CODE_SIZE - minified_size) + "";
const perc = Math.min(100, Math.floor(100 * (minified_size / MAX_CODE_SIZE)));
codeSize.style.color = `color-mix(in srgb, light-dark(black, white), var(--error) ${perc}%)`;
}
timeout = 0;
};
edit.addEventListener("input", () => {
if (timeout) clearTimeout(timeout);
timeout = setTimeout(onInput, debounce)
});
edit.dispatchEvent(new InputEvent("input"));
}
/**
* @type {Array<string>}
* to be synched with `enum TokenGroup` in bytecode/src/fmt.rs */
const TOK_CLASSES = [
'Blank',
'Comment',
'Keyword',
'Identifier',
'Directive',
'Number',
'String',
'Op',
'Assign',
'Paren',
'Bracket',
'Colon',
'Comma',
'Dot',
'Ctor',
];
/** @type {{ [key: string]: (el: HTMLElement) => void | Promise<void> }} */
const applyFns = {
timestamp: (el) => {
const timestamp = el.innerText;
const date = new Date(parseInt(timestamp) * 1000);
el.innerText = date.toLocaleString();
},
fmt,
};
/**
* @param {HTMLElement} target */
async function fmt(target) {
const code = target.innerText;
const instance = await getFmtInstance();
const decoder = new TextDecoder('utf-8');
const fmt = modifyCode(instance, code, 'fmt');
if (typeof fmt !== "string") never()
const codeBytes = new TextEncoder().encode(fmt);
const tok = modifyCode(instance, fmt, 'tok');
if (!(tok instanceof Uint8Array)) never();
target.innerHTML = '';
let start = 0;
let kind = tok[0];
for (let ii = 1; ii <= tok.length; ii += 1) {
// split over same tokens and buffer end
if (tok[ii] === kind && ii < tok.length) {
continue;
}
const text = decoder.decode(codeBytes.subarray(start, ii));
const textNode = document.createTextNode(text);
if (kind === 0) {
target.appendChild(textNode);
} else {
const el = document.createElement('span');
el.classList.add('syn');
el.classList.add(TOK_CLASSES[kind]);
el.appendChild(textNode);
target.appendChild(el);
}
if (ii == tok.length) {
break;
}
start = ii;
kind = tok[ii];
}
}
/** @param {HTMLElement} target */
function execApply(target) {
const promises = [];
for (const elem of target.querySelectorAll('[apply]')) {
if (!(elem instanceof HTMLElement)) continue;
const funcname = elem.getAttribute('apply') ?? never();
const vl = applyFns[funcname](elem);
if (vl instanceof Promise) promises.push(vl);
}
if (target === document.body) {
Promise.all(promises).then(() => document.body.hidden = false);
}
}
/** @param {HTMLElement} target */
function bindTextareaAutoResize(target) {
for (const textarea of target.querySelectorAll("textarea")) {
if (!(textarea instanceof HTMLTextAreaElement)) never();
const taCssMap = window.getComputedStyle(textarea);
const padding = parseInt(taCssMap.getPropertyValue('padding-top') ?? "0")
+ parseInt(taCssMap.getPropertyValue('padding-bottom') ?? "0");
textarea.style.height = "auto";
textarea.style.height = (textarea.scrollHeight - padding) + "px";
textarea.style.overflowY = "hidden";
textarea.addEventListener("input", function() {
let top = window.scrollY;
textarea.style.height = "auto";
textarea.style.height = (textarea.scrollHeight - padding) + "px";
window.scrollTo({ top });
});
textarea.onkeydown = (ev) => {
if (ev.key === "Tab") {
ev.preventDefault();
document.execCommand('insertText', false, "\t");
}
}
}
}
/** @param {HTMLElement} target */
function cacheInputs(target) {
/**@type {HTMLFormElement}*/ let form;
for (form of target.querySelectorAll('form')) {
const path = form.getAttribute('hx-post') || form.getAttribute('hx-delete');
if (!path) {
console.warn('form does not have a hx-post or hx-delete attribute', form);
continue;
}
for (const input of form.elements) {
if (input instanceof HTMLInputElement || input instanceof HTMLTextAreaElement) {
if ('password submit button'.includes(input.type)) continue;
const key = path + input.name;
input.value = localStorage.getItem(key) ?? '';
input.addEventListener("input", () => localStorage.setItem(key, input.value));
} else {
console.warn("unhandled form element: ", input);
}
}
}
}
/** @param {string} [path] */
function updateTab(path) {
for (const elem of document.querySelectorAll("button[hx-push-url]")) {
if (elem instanceof HTMLButtonElement)
elem.disabled = elem.getAttribute("hx-push-url") === (path ?? window.location.pathname);
}
}
if (window.location.hostname === 'localhost') {
let id; setInterval(async () => {
let new_id = await fetch('/hot-reload').then(reps => reps.text());
id ??= new_id;
if (id !== new_id) window.location.reload();
}, 300);
(async function test() {
{
const code = "main:=fn():void{return}";
const inst = await getFmtInstance()
const fmtd = modifyCode(inst, code, "fmt") ?? never();
if (typeof fmtd !== "string") never();
const prev = modifyCode(inst, fmtd, "minify") ?? never();
if (code != prev) console.error(code, prev);
}
{
const posts = [{
path: "foo.hb",
code: "main:=fn():int{return 42}",
}];
const res = compileCode(await getHbcInstance(), posts, 1) ?? never();
const expected = "exit code: 42";
if (expected != res) console.error(expected, res);
}
})()
}
document.body.addEventListener('htmx:afterSwap', (ev) => {
if (!(ev.target instanceof HTMLElement)) never();
wireUp(ev.target);
if (ev.target.tagName == "MAIN" || ev.target.tagName == "BODY")
updateTab(ev['detail'].pathInfo.finalRequestPath);
});
getFmtInstance().then(inst => {
document.body.addEventListener('htmx:configRequest', (ev) => {
const details = ev['detail'];
if (details.path === "/post" && details.verb === "post") {
details.parameters['code'] = modifyCode(inst, details.parameters['code'], "minify");
}
});
/** @param {string} query @param {string} target @returns {number} */
function fuzzyCost(query, target) {
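// Lower is better: scan the target cyclically for each query character,
// counting characters that had to be skipped; queries that match nothing at
// all get a large penalty so they sort last.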
let qi = 0, bi = 0, cost = 0, matched = false;
while (qi < query.length) {
if (query.charAt(qi) === target.charAt(bi++)) {
matched = true;
qi++;
} else {
cost++;
}
if (bi === target.length) (bi = 0, qi++);
}
return cost + (matched ? 0 : 100 * target.length);
}
let deps = undefined;
/** @param {HTMLInputElement} input @returns {void} */
function filterCodeDeps(input) {
deps ??= document.getElementById("deps");
if (!(deps instanceof HTMLElement)) never();
if (input.value === "") {
deps.textContent = "results show here...";
return;
}
deps.innerHTML = "";
for (const root of [...prevRoots.keys()]
.sort((a, b) => fuzzyCost(input.value, a) - fuzzyCost(input.value, b))) {
const pane = document.createElement("div");
const code = modifyCode(inst, localStorage["package-" + root], "fmt");
pane.innerHTML = `<div>${root}</div><pre>${code}</pre>`;
deps.appendChild(pane);
}
if (deps.innerHTML === "") {
deps.textContent = "no results";
}
}
Object.assign(window, { filterCodeDeps });
});
/** @param {HTMLElement} target */
function runPost(target) {
while (!target.matches("div[class=preview]")) target = target.parentElement ?? never();
const code = target.querySelector("pre[apply=fmt]");
if (!(code instanceof HTMLPreElement)) never();
const output = target.querySelector("pre[id=compiler-output]");
if (!(output instanceof HTMLPreElement)) never();
Promise.all([getHbcInstance(), getFmtInstance()]).then(async ([hbc, fmt]) => {
const ctx = { keyBuf: [], prevParams: new Set() };
await fetchPackages(code.innerText ?? never(), new Set(), output, ctx);
const posts = [{ path: "this", code: "" }];
loadCachedPackages(fmt, code.innerText ?? never(), ctx.keyBuf, posts, new Set());
output.textContent = compileCode(hbc, posts);
output.hidden = false;
});
let author = encodeURIComponent(target.dataset.author ?? never());
let name = encodeURIComponent(target.dataset.name ?? never());
fetch(`/post/run?author=${author}&name=${name}`, { method: "POST" })
}
Object.assign(window, { runPost });
updateTab();
wireUp(document.body);

972
depell/src/main.rs Normal file
View file

@ -0,0 +1,972 @@
#![feature(iter_collect_into)]
use {
argon2::{password_hash::SaltString, PasswordVerifier},
axum::{
body::Bytes,
extract::{DefaultBodyLimit, Path},
http::{header::COOKIE, request::Parts, StatusCode},
response::{AppendHeaders, Html},
},
const_format::formatcp,
core::fmt,
htmlm::{html, write_html},
rand_core::OsRng,
serde::{Deserialize, Serialize},
std::{
collections::{HashMap, HashSet},
fmt::{Display, Write},
net::Ipv4Addr,
},
};
const MAX_NAME_LENGTH: usize = 32;
const MAX_POSTNAME_LENGTH: usize = 64;
const MAX_CODE_LENGTH: usize = 1024 * 4;
const SESSION_DURATION_SECS: u64 = 60 * 60;
const MAX_FEED_SIZE: usize = 8 * 1024;
type Redirect<const COUNT: usize = 1> = AppendHeaders<[(&'static str, &'static str); COUNT]>;
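// Serves a pre-compressed embedded asset: the build is expected to place a
// gzipped copy next to the source file (hence the `.gz` suffix on the
// included path), and the response advertises `content-encoding: gzip`.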
macro_rules! static_asset {
($mime:literal, $body:literal) => {
get(|| async {
axum::http::Response::builder()
.header("content-type", $mime)
.header("content-encoding", "gzip")
.body(axum::body::Body::from(Bytes::from_static(include_bytes!(concat!(
$body, ".gz"
)))))
.unwrap()
})
};
}
async fn amain() {
use axum::routing::{delete, get, post};
let debug = cfg!(debug_assertions);
log::set_logger(&Logger).unwrap();
log::set_max_level(if debug { log::LevelFilter::Warn } else { log::LevelFilter::Error });
db::init();
let router = axum::Router::new()
.route("/", get(Index::page))
.route("/index.css", static_asset!("text/css", "index.css"))
.route("/index.js", static_asset!("text/javascript", "index.js"))
.route("/hbfmt.wasm", static_asset!("application/wasm", "hbfmt.wasm"))
.route("/hbc.wasm", static_asset!("application/wasm", "hbc.wasm"))
.route("/index-view", get(Index::get))
.route("/feed", get(Feed::page))
.route("/feed-view", get(Feed::get))
.route("/feed-more", post(Feed::more))
.route("/profile", get(Profile::page))
.route("/profile-view", get(Profile::get))
.route("/profile/:name", get(Profile::get_other_page))
.route("/profile/password", post(PasswordChange::post))
.route("/profile-view/:name", get(Profile::get_other))
.route("/post", get(Post::page))
.route("/post-view", get(Post::get))
.route("/post", post(Post::post))
.route("/post/run", post(Post::run))
.route("/code", post(fetch_code))
.route("/login", get(Login::page))
.route("/login-view", get(Login::get))
.route("/login", post(Login::post))
.route("/login", delete(Login::delete))
.route("/signup", get(Signup::page))
.route("/signup-view", get(Signup::get))
.route("/signup", post(Signup::post))
.route(
"/hot-reload",
get({
let id = std::time::SystemTime::now()
.duration_since(std::time::SystemTime::UNIX_EPOCH)
.unwrap()
.as_millis();
move || async move { id.to_string() }
}),
)
.layer(DefaultBodyLimit::max(16 * 1024));
#[cfg(feature = "tls")]
{
let addr =
(Ipv4Addr::UNSPECIFIED, std::env::var("DEPELL_PORT").unwrap().parse::<u16>().unwrap());
let config = axum_server::tls_rustls::RustlsConfig::from_pem_file(
std::env::var("DEPELL_CERT_PATH").unwrap(),
std::env::var("DEPELL_KEY_PATH").unwrap(),
)
.await
.unwrap();
axum_server::bind_rustls(addr.into(), config)
.serve(router.into_make_service())
.await
.unwrap();
}
#[cfg(not(feature = "tls"))]
{
let addr = (Ipv4Addr::UNSPECIFIED, 8080);
let socket = tokio::net::TcpListener::bind(addr).await.unwrap();
axum::serve(socket, router).await.unwrap();
}
}
async fn fetch_code(
axum::Json(paths): axum::Json<Vec<String>>,
) -> axum::Json<HashMap<String, String>> {
let mut deps = HashMap::<String, String>::new();
db::with(|db| {
for path in &paths {
let Some((author, name)) = path.split_once('/') else { continue };
db.fetch_deps
.query_map((name, author), |r| {
Ok((
r.get::<_, String>(1)? + "/" + r.get_ref(0)?.as_str()?,
r.get::<_, String>(2)?,
))
})
.log("fetch deps query")
.into_iter()
.flatten()
.filter_map(|r| r.log("deps row"))
.collect_into(&mut deps);
}
});
axum::Json(deps)
}
#[derive(Deserialize)]
#[serde(untagged)]
enum Feed {
Before { before_timestamp: u64 },
}
#[derive(Deserialize)]
struct Before {
before_timestamp: u64,
}
impl Feed {
async fn more(session: Session, axum::Form(data): axum::Form<Before>) -> Html<String> {
Self::Before { before_timestamp: data.before_timestamp }.render(&session)
}
}
impl Default for Feed {
fn default() -> Self {
Self::Before { before_timestamp: now() + 3600 }
}
}
impl Page for Feed {
fn render_to_buf(self, _: &Session, buf: &mut String) {
db::with(|db| {
let cursor = match self {
Feed::Before { before_timestamp } => db
.get_pots_before
.query_map((before_timestamp,), Post::from_row)
.log("fetch before posts query")
.into_iter()
.flatten()
.filter_map(|r| r.log("fetch before posts row")),
};
let base_len = buf.len();
let mut last_timestamp = None;
for post in cursor {
write!(buf, "{}", post).unwrap();
if buf.len() - base_len > MAX_FEED_SIZE {
last_timestamp = Some(post.timestamp);
break;
}
}
write_html!((*buf)
if let Some(last_timestamp) = last_timestamp {
<div "hx-post"="/feed-more"
"hx-trigger"="intersect once"
"hx-swap"="outerHTML"
"hx-vals"={format_args!("{{\"before_timestamp\":{last_timestamp}}}")}
>"there might be more"</div>
} else {
"no more stuff"
}
);
});
}
}
#[derive(Default)]
struct Index;
impl PublicPage for Index {
fn render_to_buf(self, buf: &mut String) {
buf.push_str(include_str!("welcome-page.html"));
}
}
#[derive(Deserialize, Default)]
struct Post {
author: String,
name: String,
#[serde(skip)]
timestamp: u64,
#[serde(skip)]
imports: usize,
#[serde(skip)]
runs: usize,
#[serde(skip)]
dependencies: usize,
code: String,
#[serde(skip)]
error: Option<&'static str>,
}
impl Page for Post {
fn render_to_buf(self, session: &Session, buf: &mut String) {
let Self { name, code, error, .. } = self;
write_html! { (buf)
<form id="postForm" "hx-post"="/post" "hx-swap"="outerHTML">
if let Some(e) = error { <div class="error">e</div> }
<input name="author" type="text" value={session.name} hidden>
<input name="name" type="text" placeholder="name" value=name
required maxlength=MAX_POSTNAME_LENGTH>
<div id="code-editor">
<textarea id="code-edit" name="code" placeholder="code" rows=1
required>code</textarea>
<span id="code-size">MAX_CODE_LENGTH</span>
</div>
<input type="submit" value="submit">
<pre id="compiler-output"></pre>
</form>
!{include_str!("post-page.html")}
}
}
}
#[derive(Deserialize)]
struct Run {
author: String,
name: String,
}
impl Post {
pub fn from_row(r: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Post {
author: r.get(0)?,
name: r.get(1)?,
timestamp: r.get(2)?,
code: r.get(3)?,
imports: r.get(4)?,
runs: r.get(5)?,
..Default::default()
})
}
async fn run(
session: Session,
axum::extract::Query(run): axum::extract::Query<Run>,
) -> StatusCode {
match db::with(|qes| qes.creata_run.insert((run.name, run.author, session.name))) {
Ok(_) => StatusCode::OK,
Err(e) => {
log::error!("creating run record failed: {e}");
StatusCode::INTERNAL_SERVER_ERROR
}
}
}
async fn post(
session: Session,
axum::Form(mut data): axum::Form<Self>,
) -> Result<Redirect, Html<String>> {
if data.name.len() > MAX_POSTNAME_LENGTH {
data.error = Some(formatcp!("name too long, max length is {MAX_POSTNAME_LENGTH}"));
return Err(data.render(&session));
}
if data.code.len() > MAX_CODE_LENGTH {
data.error = Some(formatcp!("code too long, max length is {MAX_CODE_LENGTH}"));
return Err(data.render(&session));
}
db::with(|db| {
if let Err(e) = db.create_post.insert((&data.name, &session.name, now(), &data.code)) {
if let rusqlite::Error::SqliteFailure(e, _) = e {
if e.code == rusqlite::ErrorCode::ConstraintViolation {
data.error = Some("this name is already used");
}
}
data.error = data.error.or_else(|| {
log::error!("create post error: {e}");
Some("internal server error")
});
return;
}
for (author, name) in hblang::lexer::Lexer::uses(&data.code)
.filter_map(|v| v.split_once('/'))
.collect::<HashSet<_>>()
{
if db
.create_import
.insert((author, name, &session.name, &data.name))
.log("create import query")
.is_none()
{
data.error = Some("internal server error");
return;
};
}
});
if data.error.is_some() {
Err(data.render(&session))
} else {
Ok(redirect("/profile"))
}
}
}
impl fmt::Display for Post {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let Self { author, name, timestamp, imports, runs, dependencies, code, .. } = self;
write_html! { f <div class="preview" "data-author"=author "data-name"=name>
<div class="info">
<span>
<a "hx-get"={format_args!("/profile-view/{author}")} href="" "hx-target"="main"
"hx-push-url"={format_args!("/profile/{author}")}
"hx-swam"="innerHTML">author</a>
"/"
name
</span>
<span apply="timestamp">timestamp</span>
for (name, count) in [include_str!("icons/download.svg"), include_str!("icons/run.svg"), "deps"]
.iter()
.zip([imports, runs, dependencies])
.filter(|(_, &c)| c != 0)
{
<div class="stat">!name count</div>
}
</div>
<div class="code">
<nav>
<button onmousedown="runPost(this)">!{include_str!("icons/run.svg")}</button>
</nav>
<pre apply="fmt">code</pre>
</div>
<pre hidden id="compiler-output"></pre>
if *timestamp == 0 {
<button "hx-get"="/post" "hx-swap"="outerHTML"
"hx-target"="[preview]">"edit"</button>
}
</div> }
Ok(())
}
}
#[derive(Deserialize, Default)]
struct PasswordChange {
old_password: String,
new_password: String,
#[serde(skip)]
error: Option<&'static str>,
}
impl PasswordChange {
async fn post(
session: Session,
axum::Form(mut change): axum::Form<PasswordChange>,
) -> Html<String> {
db::with(|que| {
match que.authenticate.query_row((&session.name,), |r| r.get::<_, String>(1)) {
Ok(hash) if verify_password(&hash, &change.old_password).is_err() => {
change.error = Some("invalid credentials");
}
Ok(_) => {
let new_hashed = hash_password(&change.new_password);
match que
.change_passowrd
.execute((new_hashed, &session.name))
.log("execute update")
{
None => change.error = Some("internal server error"),
Some(0) => change.error = Some("password is incorrect"),
Some(_) => {}
}
}
Err(rusqlite::Error::QueryReturnedNoRows) => {
change.error = Some("invalid credentials");
}
Err(e) => {
log::error!("login queri failed: {e}");
change.error = Some("internal server error");
}
}
});
if change.error.is_some() {
change.render(&session)
} else {
PasswordChange::default().render(&session)
}
}
}
impl Page for PasswordChange {
fn render_to_buf(self, _: &Session, buf: &mut String) {
let Self { old_password, new_password, error } = self;
write_html! { (buf)
<form "hx-post"="/profile/password" "hx-swap"="outerHTML">
if let Some(e) = error { <div class="error">e</div> }
<input name="old_password" type="password" autocomplete="old-password"
placeholder="old password" value=old_password>
<input name="new_password" type="password" autocomplete="new-password" placeholder="new password"
value=new_password>
<input type="submit" value="submit">
</form>
}
}
}
#[derive(Default)]
struct Profile {
other: Option<String>,
}
impl Profile {
async fn get_other(session: Session, Path(name): Path<String>) -> Html<String> {
Profile { other: Some(name) }.render(&session)
}
async fn get_other_page(session: Session, Path(name): Path<String>) -> Html<String> {
base(|b| Profile { other: Some(name) }.render_to_buf(&session, b), Some(&session))
}
}
impl Page for Profile {
fn render_to_buf(self, session: &Session, buf: &mut String) {
db::with(|db| {
let name = self.other.as_ref().unwrap_or(&session.name);
let iter = db
.get_user_posts
.query_map((name,), Post::from_row)
.log("get user posts query")
.into_iter()
.flatten()
.filter_map(|p| p.log("user post row"));
write_html! { (*buf)
if name == &session.name {
|b|{PasswordChange::default().render_to_buf(session, b)}
}
for post in iter {
!{post}
} else {
"no posts"
}
!{include_str!("profile-page.html")}
}
})
}
}
fn hash_password(password: &str) -> String {
use argon2::PasswordHasher;
argon2::Argon2::default()
.hash_password(password.as_bytes(), &SaltString::generate(&mut OsRng))
.unwrap()
.to_string()
}
fn verify_password(hash: &str, password: &str) -> Result<(), argon2::password_hash::Error> {
argon2::Argon2::default()
.verify_password(password.as_bytes(), &argon2::PasswordHash::new(hash)?)
}
#[derive(Serialize, Deserialize, Default, Debug)]
struct Login {
name: String,
password: String,
#[serde(skip)]
error: Option<&'static str>,
}
impl PublicPage for Login {
fn render_to_buf(self, buf: &mut String) {
let Self { name, password, error } = self;
write_html! { (buf)
<form "hx-post"="/login" "hx-swap"="outerHTML">
if let Some(e) = error { <div class="error">e</div> }
<input name="name" type="text" autocomplete="name" placeholder="name" value=name
required maxlength=MAX_NAME_LENGTH>
<input name="password" type="password" autocomplete="current-password" placeholder="password"
value=password>
<input type="submit" value="submit">
</form>
}
}
}
impl Login {
async fn post(
axum::Form(mut data): axum::Form<Self>,
) -> Result<AppendHeaders<[(&'static str, String); 2]>, Html<String>> {
// TODO: hash password
let mut id = [0u8; 32];
db::with(|db| match db.authenticate.query_row((&data.name,), |r| r.get::<_, String>(1)) {
Ok(hash) => {
if verify_password(&hash, &data.password).is_err() {
data.error = Some("invalid credentials");
} else {
getrandom::getrandom(&mut id).unwrap();
if db
.login
.insert((id, &data.name, now() + SESSION_DURATION_SECS))
.log("create session query")
.is_none()
{
data.error = Some("internal server error");
}
}
}
Err(rusqlite::Error::QueryReturnedNoRows) => {
data.error = Some("invalid credentials");
}
Err(e) => {
log::error!("login queri failed: {e}");
data.error = Some("internal server error");
}
});
if data.error.is_some() {
Err(data.render())
} else {
Ok(AppendHeaders([
("hx-location", "/feed".into()),
(
"set-cookie",
format!(
"id={}; SameSite=Strict; Secure; Max-Age={SESSION_DURATION_SECS}",
to_hex(&id)
),
),
]))
}
}
async fn delete(session: Session) -> Redirect {
_ = db::with(|q| q.logout.execute((session.id,)).log("delete session query"));
redirect("/login")
}
}
#[derive(Serialize, Deserialize, Default)]
struct Signup {
name: String,
new_password: String,
confirm_password: String,
#[serde(default)]
confirm_no_password: bool,
#[serde(skip)]
error: Option<&'static str>,
}
impl PublicPage for Signup {
fn render_to_buf(self, buf: &mut String) {
let Signup { name, new_password, confirm_password, confirm_no_password, error } = self;
let vals = if confirm_no_password { "{\"confirm_no_password\":true}" } else { "{}" };
write_html! { (buf)
<form "hx-post"="/signup" "hx-swap"="outerHTML" "hx-vals"=vals>
if let Some(e) = error { <div class="error">e</div> }
<input name="name" type="text" autocomplete="name" placeholder="name" value=name
maxlength=MAX_NAME_LENGTH required>
<input name="new_password" type="password" autocomplete="new-password" placeholder="new password"
value=new_password>
<input name="confirm_password" type="password" autocomplete="confirm-password"
placeholder="confirm password" value=confirm_password>
<input type="submit" value="submit">
</form>
}
}
}
impl Signup {
async fn post(axum::Form(mut data): axum::Form<Self>) -> Result<Redirect, Html<String>> {
if data.name.len() > MAX_NAME_LENGTH {
data.error = Some(formatcp!("name too long, max length is {MAX_NAME_LENGTH}"));
return Err(data.render());
}
if !data.confirm_no_password && data.new_password.is_empty() {
data.confirm_no_password = true;
data.error = Some("Are you sure you don't want to use a password? (then submit again)");
return Err(data.render());
}
db::with(|db| {
// TODO: hash passwords
match db.register.insert((&data.name, hash_password(&data.new_password))) {
Ok(_) => {}
Err(rusqlite::Error::SqliteFailure(e, _))
if e.code == rusqlite::ErrorCode::ConstraintViolation =>
{
data.error = Some("username already taken");
}
Err(e) => {
log::error!("create user query: {e}");
data.error = Some("internal server error");
}
};
});
if data.error.is_some() {
Err(data.render())
} else {
Ok(redirect("/login"))
}
}
}
fn base(body: impl FnOnce(&mut String), session: Option<&Session>) -> Html<String> {
let username = session.map(|s| &s.name);
let nav_button = |f: &mut String, name: &str| {
write_html! {(f)
<button "hx-push-url"={format_args!("/{name}")}
"hx-get"={format_args!("/{name}-view")}
"hx-target"="main"
"hx-swap"="innerHTML">name</button>
}
};
Html(html! {
"<!DOCTYPE html>"
<html lang="en">
<head>
<meta name="charset" content="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="description" content="code dependency hell socila media hblang">
<link rel="stylesheet" href="/index.css">
<title>"depell"</title>
</head>
<body hidden>
<nav>
<button "hx-push-url"="/" "hx-get"="/index-view" "hx-target"="main" "hx-swap"="innerHTML">"depell"</button>
<section>
if let Some(username) = username {
<button "hx-push-url"={format_args!("/profile/{username}")} "hx-get"="/profile-view" "hx-target"="main"
"hx-swap"="innerHTML">username</button>
|f|{nav_button(f, "feed"); nav_button(f, "post")}
<button "hx-delete"="/login">"logout"</button>
} else {
|f|{nav_button(f, "login"); nav_button(f, "signup")}
}
</section>
</nav>
<section id="post-form"></section>
<main>|f|{body(f)}</main>
</body>
<script src="https://unpkg.com/htmx.org@2.0.3/dist/htmx.min.js" integrity="sha384-0895/pl2MU10Hqc6jd4RvrthNlDiE9U1tWmX7WRESftEDRosgxNsQG/Ze9YMRzHq" crossorigin="anonymous"></script>
<script type="module" src="/index.js"></script>
</html>
})
}
struct Session {
name: String,
id: [u8; 32],
}
#[axum::async_trait]
impl<S> axum::extract::FromRequestParts<S> for Session {
/// If the extractor fails it'll use this "rejection" type. A rejection is
/// a kind of error that can be converted into a response.
type Rejection = Redirect;
/// Perform the extraction.
async fn from_request_parts(parts: &mut Parts, _: &S) -> Result<Self, Self::Rejection> {
let err = redirect("/login");
let value = parts
.headers
.get_all(COOKIE)
.into_iter()
.find_map(|c| c.to_str().ok()?.trim().strip_prefix("id="))
.map(|c| c.split_once(';').unwrap_or((c, "")).0)
.ok_or(err)?;
let mut id = [0u8; 32];
parse_hex(value, &mut id).ok_or(err)?;
let (name, expiration) = db::with(|db| {
db.get_session
.query_row((id,), |r| Ok((r.get::<_, String>(0)?, r.get::<_, u64>(1)?)))
.log("fetching session")
.ok_or(err)
})?;
if expiration < now() {
return Err(err);
}
Ok(Self { name, id })
}
}
fn now() -> u64 {
std::time::SystemTime::now()
.duration_since(std::time::SystemTime::UNIX_EPOCH)
.unwrap()
.as_secs()
}
fn parse_hex(hex: &str, dst: &mut [u8]) -> Option<()> {
fn hex_to_nibble(b: u8) -> Option<u8> {
Some(match b {
b'a'..=b'f' => b - b'a' + 10,
b'A'..=b'F' => b - b'A' + 10,
b'0'..=b'9' => b - b'0',
_ => return None,
})
}
if hex.len() != dst.len() * 2 {
return None;
}
for (d, p) in dst.iter_mut().zip(hex.as_bytes().chunks_exact(2)) {
*d = (hex_to_nibble(p[0])? << 4) | hex_to_nibble(p[1])?;
}
Some(())
}
fn to_hex(src: &[u8]) -> String {
use std::fmt::Write;
let mut buf = String::new();
for &b in src {
write!(buf, "{b:02x}").unwrap()
}
buf
}
fn main() {
tokio::runtime::Builder::new_current_thread().enable_all().build().unwrap().block_on(amain());
}
mod db {
use std::cell::RefCell;
macro_rules! gen_queries {
($vis:vis struct $name:ident {
$($qname:ident: $code:expr,)*
}) => {
$vis struct $name<'a> {
$($vis $qname: rusqlite::Statement<'a>,)*
}
impl<'a> $name<'a> {
fn new(db: &'a rusqlite::Connection) -> Self {
Self {
$($qname: db.prepare($code).unwrap(),)*
}
}
}
};
}
gen_queries! {
pub struct Queries {
register: "INSERT INTO user (name, password_hash) VALUES(?, ?)",
change_passowrd: "UPDATE user SET password_hash = ? WHERE name = ?",
authenticate: "SELECT name, password_hash FROM user WHERE name = ?",
login: "INSERT OR REPLACE INTO session (id, username, expiration) VALUES(?, ?, ?)",
logout: "DELETE FROM session WHERE id = ?",
get_session: "SELECT username, expiration FROM session WHERE id = ?",
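// The two correlated subqueries below compute, per post, how many other posts
// transitively import it (`imports`, excluding the post itself) and how many
// run records exist for the post plus its transitive importers (`runs`),
// which is why runs count even when a program is executed indirectly.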
get_user_posts: "SELECT author, name, timestamp, code, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM
post JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT (count(*) - 1) FROM roots
) AS imports, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM post
JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT count(*) FROM roots
JOIN run ON roots.name = run.code_name
AND roots.author = run.code_author
) AS runs FROM post as outher WHERE author = ? ORDER BY timestamp DESC",
// TODO: we might want to cache the recursive queries
get_pots_before: "SELECT author, name, timestamp, code, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM
post JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT (count(*) - 1) FROM roots
) AS imports, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM post
JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT count(*) FROM roots
JOIN run ON roots.name = run.code_name
AND roots.author = run.code_author
) as runs FROM post AS outher WHERE timestamp < ?",
create_post: "INSERT INTO post (name, author, timestamp, code) VALUES(?, ?, ?, ?)",
fetch_deps: "
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = ? AND author = ?
UNION
SELECT post.name, post.author, post.code FROM
post JOIN import ON post.name = import.to_name
AND post.author = import.to_author
JOIN roots ON import.from_name = roots.name
AND import.from_author = roots.author
) SELECT * FROM roots;
",
create_import: "INSERT INTO import(to_author, to_name, from_author, from_name)
VALUES(?, ?, ?, ?)",
creata_run: "INSERT OR IGNORE INTO run(code_name, code_author, runner) VALUES(?, ?, ?)",
}
}
struct Db {
queries: Queries<'static>,
_db: Box<rusqlite::Connection>,
}
impl Db {
fn new() -> Self {
let db = Box::new(rusqlite::Connection::open("db.sqlite").unwrap());
Self {
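// SAFETY note: the prepared statements borrow from the boxed connection kept
// in `_db`; the Box keeps its address stable, and fields drop in declaration
// order, so `queries` is dropped before the connection it borrows from.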
queries: Queries::new(unsafe {
std::mem::transmute::<&rusqlite::Connection, &rusqlite::Connection>(&db)
}),
_db: db,
}
}
}
pub fn with<T>(with: impl FnOnce(&mut Queries) -> T) -> T {
thread_local! { static DB_CONN: RefCell<Db> = RefCell::new(Db::new()); }
DB_CONN.with_borrow_mut(|q| with(&mut q.queries))
}
pub fn init() {
const SCHEMA_VERSION: usize = 0;
const MIGRATIONS: &[&str] = &[include_str!("migrations/1.sql")];
let db = rusqlite::Connection::open("db.sqlite").unwrap();
db.execute_batch(include_str!("schema.sql")).unwrap();
let schema_version =
db.pragma_query_value(None, "user_version", |v| v.get::<_, usize>(0)).unwrap();
if schema_version != SCHEMA_VERSION {
for &mig in &MIGRATIONS[schema_version..] {
db.execute_batch(mig).expect(mig);
}
db.pragma_update(None, "user_version", SCHEMA_VERSION).unwrap();
}
Queries::new(&db);
}
}
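// `hx-location` is an htmx response header: the client swaps in the new page
// client-side instead of performing a full browser redirect.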
fn redirect(to: &'static str) -> Redirect {
AppendHeaders([("hx-location", to)])
}
trait PublicPage: Default {
fn render_to_buf(self, buf: &mut String);
fn render(self) -> Html<String> {
let mut str = String::new();
self.render_to_buf(&mut str);
Html(str)
}
async fn get() -> Html<String> {
Self::default().render()
}
async fn page(session: Option<Session>) -> Html<String> {
base(|s| Self::default().render_to_buf(s), session.as_ref())
}
}
trait Page: Default {
fn render_to_buf(self, session: &Session, buf: &mut String);
fn render(self, session: &Session) -> Html<String> {
let mut str = String::new();
self.render_to_buf(session, &mut str);
Html(str)
}
async fn get(session: Session) -> Html<String> {
Self::default().render(&session)
}
async fn page(session: Option<Session>) -> Result<Html<String>, axum::response::Redirect> {
match session {
Some(session) => {
Ok(base(|f| Self::default().render_to_buf(&session, f), Some(&session)))
}
None => Err(axum::response::Redirect::permanent("/login")),
}
}
}
trait ResultExt<O, E> {
fn log(self, prefix: impl Display) -> Option<O>;
}
impl<O, E: Display> ResultExt<O, E> for Result<O, E> {
fn log(self, prefix: impl Display) -> Option<O> {
match self {
Ok(v) => Some(v),
Err(e) => {
log::error!("{prefix}: {e}");
None
}
}
}
}
struct Logger;
impl log::Log for Logger {
fn enabled(&self, _: &log::Metadata) -> bool {
true
}
fn log(&self, record: &log::Record) {
if self.enabled(record.metadata()) {
eprintln!("{} - {}", record.module_path().unwrap_or("=="), record.args());
}
}
fn flush(&self) {}
}

View file

@ -0,0 +1 @@

21
depell/src/post-page.html Normal file
View file

@ -0,0 +1,21 @@
<div id="dep-list">
<input placeholder="search impoted deps.." oninput="filterCodeDeps(this, event)">
<section id="deps">
results show here...
</section>
</div>
<div>
<h3>About posting code</h3>
<p>
If you are unfamiliar with <a href="https://git.ablecorp.us/AbleOS/holey-bytes">hblang</a>, refer to the
<strong>hblang/README.md</strong> or
visit <a href="/profile/mlokis">mlokis's posts</a>. Preferably don't edit the code here.
</p>
<h3>Extra textarea features</h3>
<ul>
<li>proper tab behaviour</li>
<li>snap to previous tab boundary on "empty" lines</li>
</ul>

55
depell/src/schema.sql Normal file
View file

@ -0,0 +1,55 @@
PRAGMA foreign_keys = ON;
CREATE TABLE IF NOT EXISTS user(
name TEXT NOT NULL,
password_hash TEXT NOT NULL,
PRIMARY KEY (name)
) WITHOUT ROWID;
CREATE TABLE IF NOT EXISTS session(
id BLOB NOT NULL,
username TEXT NOT NULL,
expiration INTEGER NOT NULL,
FOREIGN KEY (username) REFERENCES user (name),
PRIMARY KEY (username)
) WITHOUT ROWID;
CREATE UNIQUE INDEX IF NOT EXISTS
session_id ON session (id);
CREATE TABLE IF NOT EXISTS post(
name TEXT NOT NULL,
author TEXT,
timestamp INTEGER,
code TEXT NOT NULL,
FOREIGN KEY (author) REFERENCES user(name) ON DELETE SET NULL,
PRIMARY KEY (author, name)
);
CREATE INDEX IF NOT EXISTS
post_timestamp ON post(timestamp DESC);
CREATE TABLE IF NOT EXISTS import(
from_name TEXT NOT NULL,
from_author TEXT,
to_name TEXT NOT NULL,
to_author TEXT,
FOREIGN KEY (from_name, from_author) REFERENCES post(name, author),
FOREIGN KEY (to_name, to_author) REFERENCES post(name, author)
);
CREATE INDEX IF NOT EXISTS
dependencies ON import(from_name, from_author);
CREATE INDEX IF NOT EXISTS
dependants ON import(to_name, to_author);
CREATE TABLE IF NOT EXISTS run(
code_name TEXT NOT NULL,
code_author TEXT NOT NULL,
runner TEXT NOT NULL,
FOREIGN KEY (code_name, code_author) REFERENCES post(name, author),
FOREIGN KEY (runner) REFERENCES user(name),
PRIMARY KEY (code_name, code_author, runner)
);

View file

@ -0,0 +1,17 @@
<h1>Welcome to depell</h1>
<p>
Depell (dependency hell) is a simple "social" media site best compared to Twitter, except that all you can post is
<a href="https://git.ablecorp.us/AbleOS/holey-bytes">hblang</a> code with no comments allowed. Instead of likes you
run the program, and instead of retweets you import the program as a dependency. Runs are counted even when the program is run indirectly.
</p>
<p>
The backend only serves the code; the frontend compiles and runs it locally. All posts are immutable.
</p>
<h2>Security?</h2>
<p>
All code runs in WASM (inside a holey-bytes VM until hblang compiles to WASM) and is controlled by JavaScript. WASM
can't do any form of IO without going through JavaScript, so as long as the JS imports don't allow WASM to execute
arbitrary JS code, WASM can act as a container inside the JS.
</p>
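A minimal sketch (in JavaScript, mirroring what index.js in this change already does) of why the empty import object matters; the function name here is only illustrative:

// Illustrative only: with no entries in the import object, the module gets no
// host functions at all, so it can only read and write its own linear memory.
// Whatever it computes becomes visible only through the exports the page
// chooses to read.
async function sandboxedCompiler() {
    const { instance } = await WebAssembly.instantiateStreaming(
        fetch("/hbc.wasm"), // same path the real frontend fetches
        {},                 // empty imports: no DOM, network, or file access
    );
    return instance;
}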

View file

@ -0,0 +1,11 @@
[package]
name = "wasm-hbfmt"
version = "0.1.0"
edition = "2021"
[lib]
crate-type = ["cdylib"]
[dependencies]
hblang = { workspace = true, features = ["no_log"] }
wasm-rt = { version = "0.1.0", path = "../wasm-rt" }

View file

@ -0,0 +1,42 @@
#![no_std]
#![feature(str_from_raw_parts)]
#![feature(alloc_error_handler)]
use hblang::{fmt, parser};
wasm_rt::decl_runtime!(128 * 1024, 1024 * 4);
const MAX_OUTPUT_SIZE: usize = 1024 * 10;
wasm_rt::decl_buffer!(MAX_OUTPUT_SIZE, MAX_OUTPUT, OUTPUT, OUTPUT_LEN);
const MAX_INPUT_SIZE: usize = 1024 * 4;
wasm_rt::decl_buffer!(MAX_INPUT_SIZE, MAX_INPUT, INPUT, INPUT_LEN);
#[no_mangle]
unsafe extern "C" fn fmt() {
ALLOCATOR.reset();
let code = core::str::from_raw_parts(core::ptr::addr_of!(INPUT).cast(), INPUT_LEN);
let arena = parser::Arena::with_capacity(code.len() * parser::SOURCE_TO_AST_FACTOR);
let mut ctx = parser::Ctx::default();
let exprs = parser::Parser::parse(&mut ctx, code, "source.hb", &mut parser::no_loader, &arena);
let mut f = wasm_rt::Write(&mut OUTPUT[..]);
fmt::fmt_file(exprs, code, &mut f).unwrap();
OUTPUT_LEN = MAX_OUTPUT_SIZE - f.0.len();
}
#[no_mangle]
unsafe extern "C" fn tok() {
let code = core::slice::from_raw_parts_mut(
core::ptr::addr_of_mut!(OUTPUT).cast(), OUTPUT_LEN);
OUTPUT_LEN = fmt::get_token_kinds(code);
}
#[no_mangle]
unsafe extern "C" fn minify() {
let code = core::str::from_raw_parts_mut(
core::ptr::addr_of_mut!(OUTPUT).cast(), OUTPUT_LEN);
OUTPUT_LEN = fmt::minify(code);
}

View file

@ -0,0 +1,14 @@
[package]
name = "wasm-hbc"
version = "0.1.0"
edition = "2021"
[lib]
crate-type = ["cdylib"]
[dependencies]
hblang = { workspace = true, features = [] }
hbvm.workspace = true
log = { version = "0.4.22", features = ["release_max_level_error"] }
wasm-rt = { version = "0.1.0", path = "../wasm-rt", features = ["log"] }

127
depell/wasm-hbc/src/lib.rs Normal file
View file

@ -0,0 +1,127 @@
#![feature(alloc_error_handler)]
#![feature(slice_take)]
#![no_std]
use {
alloc::{string::String, vec::Vec},
core::ffi::CStr,
hblang::{
son::{hbvm::HbvmBackend, Codegen, CodegenCtx},
ty::Module,
Ent,
},
};
extern crate alloc;
const ARENA_CAP: usize = 128 * 16 * 1024;
wasm_rt::decl_runtime!(ARENA_CAP, 1024 * 4);
const MAX_INPUT_SIZE: usize = 32 * 4 * 1024;
wasm_rt::decl_buffer!(MAX_INPUT_SIZE, MAX_INPUT, INPUT, INPUT_LEN);
#[no_mangle]
unsafe fn compile_and_run(mut fuel: usize) {
ALLOCATOR.reset();
_ = log::set_logger(&wasm_rt::Logger);
log::set_max_level(log::LevelFilter::Error);
struct File<'a> {
path: &'a str,
code: &'a mut str,
}
let mut root = 0;
let files = {
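// Decode the records written by index.js's packPosts into INPUT:
// u16 LE path length, path bytes, u16 LE code length, code bytes.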
let mut input_bytes =
core::slice::from_raw_parts_mut(core::ptr::addr_of_mut!(INPUT).cast::<u8>(), INPUT_LEN);
let mut files = Vec::with_capacity(32);
while let Some((&mut path_len, rest)) = input_bytes.split_first_chunk_mut() {
let (path, rest) = rest.split_at_mut(u16::from_le_bytes(path_len) as usize);
let (&mut code_len, rest) = rest.split_first_chunk_mut().unwrap();
let (code, rest) = rest.split_at_mut(u16::from_le_bytes(code_len) as usize);
files.push(File {
path: core::str::from_utf8_unchecked(path),
code: core::str::from_utf8_unchecked_mut(code),
});
input_bytes = rest;
}
let root_path = files[root].path;
hblang::quad_sort(&mut files, |a, b| a.path.cmp(b.path));
root = files.binary_search_by_key(&root_path, |p| p.path).unwrap();
files
};
let mut ctx = CodegenCtx::default();
let files = {
let paths = files.iter().map(|f| f.path).collect::<Vec<_>>();
let mut loader = |path: &str, _: &str, kind| match kind {
hblang::parser::FileKind::Module => Ok(paths.binary_search(&path).unwrap()),
hblang::parser::FileKind::Embed => Err("embeds are not supported".into()),
};
files
.into_iter()
.map(|f| {
hblang::parser::Ast::new(
f.path,
// since 'free' does nothing this is fine
String::from_raw_parts(f.code.as_mut_ptr(), f.code.len(), f.code.len()),
&mut ctx.parser,
&mut loader,
)
})
.collect::<Vec<_>>()
};
let mut ct = {
let mut backend = HbvmBackend::default();
Codegen::new(&mut backend, &files, &mut ctx).generate(Module::new(root));
if !ctx.parser.errors.borrow().is_empty() {
log::error!("{}", ctx.parser.errors.borrow());
return;
}
let mut c = Codegen::new(&mut backend, &files, &mut ctx);
c.assemble_comptime()
};
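// Each `Timer` interrupt from the VM burns one unit of fuel, so a
// non-terminating program is cut off after `fuel` time slices instead of
// hanging the page.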
while fuel != 0 {
match ct.vm.run() {
Ok(hbvm::VmRunOk::End) => {
log::error!("exit code: {}", ct.vm.read_reg(1).0 as i64);
break;
}
Ok(hbvm::VmRunOk::Ecall) => {
let kind = ct.vm.read_reg(2).0;
match kind {
0 => {
let str = ct.vm.read_reg(3).0;
let str = unsafe { CStr::from_ptr(str as _) };
log::error!("{}", str.to_str().unwrap());
}
unknown => log::error!("unknown ecall: {unknown}"),
}
}
Ok(hbvm::VmRunOk::Timer) => {
fuel -= 1;
if fuel == 0 {
log::error!("program timed out");
}
}
Ok(hbvm::VmRunOk::Breakpoint) => todo!(),
Err(e) => {
log::error!("vm error: {e}");
break;
}
}
}
//log::error!("memory consumption: {}b / {}b", ALLOCATOR.used(), ARENA_CAP);
}

View file

@ -0,0 +1,7 @@
[package]
name = "wasm-rt"
version = "0.1.0"
edition = "2021"
[dependencies]
log = { version = "0.4.22", optional = true }

162
depell/wasm-rt/src/lib.rs Normal file
View file

@ -0,0 +1,162 @@
#![feature(alloc_error_handler)]
#![feature(pointer_is_aligned_to)]
#![feature(slice_take)]
#![no_std]
use core::{
alloc::{GlobalAlloc, Layout},
cell::UnsafeCell,
};
extern crate alloc;
#[macro_export]
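// Exposes a fixed-capacity byte buffer as three #[no_mangle] statics
// (capacity, base address, current length) so the JS glue can reach them
// through the module's WebAssembly exports.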
macro_rules! decl_buffer {
($cap:expr, $export_cap:ident, $export_base:ident, $export_len:ident) => {
#[no_mangle]
static $export_cap: usize = $cap;
#[no_mangle]
static mut $export_base: [u8; $cap] = [0; $cap];
#[no_mangle]
static mut $export_len: usize = 0;
};
}
#[macro_export]
macro_rules! decl_runtime {
($memory_size:expr, $max_panic_size:expr) => {
#[cfg(debug_assertions)]
#[no_mangle]
static mut PANIC_MESSAGE: [u8; $max_panic_size] = [0; $max_panic_size];
#[cfg(debug_assertions)]
#[no_mangle]
static mut PANIC_MESSAGE_LEN: usize = 0;
#[cfg(target_arch = "wasm32")]
#[panic_handler]
pub fn handle_panic(_info: &core::panic::PanicInfo) -> ! {
#[cfg(debug_assertions)]
{
unsafe {
use core::fmt::Write;
let mut f = $crate::Write(&mut PANIC_MESSAGE[..]);
_ = writeln!(f, "{}", _info);
PANIC_MESSAGE_LEN = $max_panic_size - f.0.len();
}
}
core::arch::wasm32::unreachable();
}
#[global_allocator]
static ALLOCATOR: $crate::ArenaAllocator<{ $memory_size }> = $crate::ArenaAllocator::new();
#[cfg(target_arch = "wasm32")]
#[alloc_error_handler]
fn alloc_error(_: core::alloc::Layout) -> ! {
#[cfg(debug_assertions)]
{
unsafe {
use core::fmt::Write;
let mut f = $crate::Write(&mut PANIC_MESSAGE[..]);
_ = writeln!(f, "out of memory");
PANIC_MESSAGE_LEN = $max_panic_size - f.0.len();
}
}
core::arch::wasm32::unreachable()
}
};
}
#[cfg(feature = "log")]
pub struct Logger;
#[cfg(feature = "log")]
impl log::Log for Logger {
fn enabled(&self, _: &log::Metadata) -> bool {
true
}
fn log(&self, record: &log::Record) {
if self.enabled(record.metadata()) {
const MAX_LOG_MESSAGE: usize = 1024 * 8;
#[no_mangle]
static mut LOG_MESSAGES: [u8; MAX_LOG_MESSAGE] = [0; MAX_LOG_MESSAGE];
#[no_mangle]
static mut LOG_MESSAGES_LEN: usize = 0;
unsafe {
use core::fmt::Write;
let mut f = Write(&mut LOG_MESSAGES[LOG_MESSAGES_LEN..]);
_ = writeln!(f, "{}", record.args());
LOG_MESSAGES_LEN = MAX_LOG_MESSAGE - f.0.len();
}
}
}
fn flush(&self) {}
}
pub struct ArenaAllocator<const SIZE: usize> {
arena: UnsafeCell<[u8; SIZE]>,
head: UnsafeCell<*mut u8>,
}
impl<const SIZE: usize> ArenaAllocator<SIZE> {
#[expect(clippy::new_without_default)]
pub const fn new() -> Self {
ArenaAllocator {
arena: UnsafeCell::new([0; SIZE]),
head: UnsafeCell::new(core::ptr::null_mut()),
}
}
#[expect(clippy::missing_safety_doc)]
pub unsafe fn reset(&self) {
(*self.head.get()) = self.arena.get().cast::<u8>().add(SIZE);
}
pub fn used(&self) -> usize {
unsafe { self.arena.get() as usize + SIZE - (*self.head.get()) as usize }
}
}
unsafe impl<const SIZE: usize> Sync for ArenaAllocator<SIZE> {}
unsafe impl<const SIZE: usize> GlobalAlloc for ArenaAllocator<SIZE> {
unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
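// Bump allocation growing downward from the end of the arena: step the head
// back by `size`, round down to the requested (power-of-two) alignment, and
// return null once the allocation would fall before the arena's start.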
let size = layout.size();
let align = layout.align();
let until = self.arena.get() as *mut u8;
let new_head = (*self.head.get()).sub(size);
let aligned_head = (new_head as usize & !(align - 1)) as *mut u8;
debug_assert!(aligned_head.is_aligned_to(align));
if until > aligned_head {
return core::ptr::null_mut();
}
*self.head.get() = aligned_head;
aligned_head
}
unsafe fn dealloc(&self, _ptr: *mut u8, _layout: Layout) {
/* lol */
}
}
pub struct Write<'a>(pub &'a mut [u8]);
impl core::fmt::Write for Write<'_> {
fn write_str(&mut self, s: &str) -> core::fmt::Result {
if let Some(m) = self.0.take_mut(..s.len()) {
m.copy_from_slice(s.as_bytes());
Ok(())
} else {
Err(core::fmt::Error)
}
}
}

View file

@ -1,9 +0,0 @@
[package]
name = "hbasm"
version = "0.1.0"
edition = "2021"
[dependencies]
paste = "1.0"
rhai = "1.16"
with_builtin_macros = "0.0.3"

View file

@ -1,13 +0,0 @@
import "hbasm/examples/ableos/std" as std;
fn main(){
std::Error(":+)");
std::Warn("Your mom fell in a well!");
std::Info("Hello, world!");
std::Debug("ABC");
std::Trace("Trace Deez");
tx();
}
main();

View file

@ -1,24 +0,0 @@
fn ipc_send(buffer_id, mem_addr, length){
// set the ecall
li8(r1, 3);
// Set the buffer ID to be the BufferID
li64(r2, buffer_id);
lra(r3, r0, mem_addr);
// set the length
li64(r4, length);
// ecall
eca();
}
private fn log(log_level, string){
let str = data::str(string);
ipc_send(1, str, str.len);
}
fn Error(string) {log(0, string);}
fn Warn(string) {log(1, string);}
fn Info(string) {log(2, string);}
// Due to rhai limitations this cannot be debug
// because of this all of the log levels are upper case
fn Debug(string) {log(3, string);}
fn Trace(string) {log(4, string);}

View file

@ -1,9 +0,0 @@
let hello = data::str("Hello, world!");
li8 (r1, 1); // Write syscall
li8 (r2, 1); // Stdout FD
lra16 (r3, r0, hello); // String buffer
li8 (r4, hello.len); // String length
eca (); // System call
tx (); // End program

View file

@ -1,33 +0,0 @@
li8(r1, 69);
li8(r2, 0);
if_eq(r1, r2,
|| puts("Equals!"),
|| puts("Not equals!"),
);
tx(); // END OF MAIN
/// Inline function write text to stdout
fn puts(string) {
let d = data::str(string);
li8 (r1, 1); // Write syscall
li8 (r2, 1); // Stdout handle
lra16 (r3, r0, d);
li64 (r4, d.len);
eca ();
}
fn if_eq(a, b, thenblk, elseblk) {
let elselbl = declabel();
let endlbl = declabel();
jne(a, b, elselbl);
thenblk.call();
jmp16(endlbl);
elselbl.here();
elseblk.call();
endlbl.here();
}

View file

@ -1,101 +0,0 @@
//! Data section inserts
use {
crate::{object::SymbolRef, SharedObject},
rhai::{CustomType, Engine, FuncRegistration, ImmutableString, Module},
};
/// Generate insertions for data types
///
/// `gen_data_instructions!($module, $obj, [$type, …]);`
/// - `$module`: Rhai module
/// - `$obj`: Code object
/// - `$type`: Type of single array item
macro_rules! gen_data_insertions {
($module:expr, $obj:expr, [$($ty:ident),* $(,)?] $(,)?) => {{
let (module, obj) = ($module, $obj);
$({
// Clone object to each function
let obj = ::std::rc::Rc::clone(obj);
FuncRegistration::new(stringify!($ty))
.with_namespace(rhai::FnNamespace::Global)
.set_into_module::<_, 1, false, _, true, _>(module, move |arr: ::rhai::Array| {
let obj = &mut *obj.borrow_mut();
let symbol = obj.symbol($crate::object::Section::Data);
// Reserve space for object so we don't resize it
// all the time
obj.sections
.data
.reserve(arr.len() * ::std::mem::size_of::<$ty>());
// For every item…
for item in arr {
// … try to convert it from a Rhai integer to the desired type
// and insert it.
obj.sections.data.extend(
match item.as_int() {
Ok(num) => $ty::try_from(num).map_err(|_| "i64".to_owned()),
Err(ty) => Err(ty.to_owned()),
}
.map_err(|err| {
::rhai::EvalAltResult::ErrorMismatchDataType(
stringify!($ty).to_owned(),
err,
::rhai::Position::NONE,
)
})?
.to_le_bytes(),
);
}
Ok(DataRef {
symbol,
len: obj.sections.data.len() - symbol.0,
})
});
})*
}};
}
/// Reference to entry in data section
#[derive(Clone, Copy, Debug)]
pub struct DataRef {
pub symbol: SymbolRef,
pub len: usize,
}
impl CustomType for DataRef {
fn build(mut builder: rhai::TypeBuilder<Self>) {
builder
.with_name("DataRef")
.with_get("symbol", |this: &mut Self| this.symbol)
.with_get("len", |this: &mut Self| this.len as u64 as i64);
}
}
pub fn module(engine: &mut Engine, obj: SharedObject) -> Module {
let mut module = Module::new();
gen_data_insertions!(&mut module, &obj, [i8, i16, i32, i64]);
// Specialisation for strings, they should be
// inserted as plain UTF-8 arrays
FuncRegistration::new("str")
.with_namespace(rhai::FnNamespace::Global)
.set_into_module::<_, 1, false, _, true, _>(&mut module, move |s: ImmutableString| {
let obj = &mut *obj.borrow_mut();
let symbol = obj.symbol(crate::object::Section::Data);
obj.sections.data.extend(s.as_bytes());
Ok(DataRef {
symbol,
len: s.len(),
})
});
engine.build_type::<DataRef>();
module
}


@ -1,321 +0,0 @@
//! Functions for inserting instructions
//!
//! Most of the code you see is just metaprogramming stuff.
//! This ensures that adding new instructions won't need any
//! specific changes and keeps the behaviour consistent.
//!
//! > I tried to comment stuff here, but meanwhile I forgot how it works.
//!
//! — Erin
use {
crate::object::Object,
rhai::{FuncRegistration, Module},
std::{cell::RefCell, rc::Rc},
};
/// Operand types and their insertions
pub mod optypes {
use {
crate::{
label::UnboundLabel,
object::{Object, RelocKey, RelocType, SymbolRef},
},
rhai::{Dynamic, EvalAltResult, ImmutableString, Position},
};
// These types represent operand types to be inserted
pub type R = u8;
pub type B = i8;
pub type H = i16;
pub type W = i32;
pub type D = i64;
pub type A = Dynamic;
pub type O = Dynamic;
pub type P = Dynamic;
/// Insert relocation into code
///
/// - If integer, just write it to the code
/// - Otherwise insert entry into relocation table
/// and fill zeroes
pub fn insert_reloc(
obj: &mut Object,
ty: RelocType,
val: &Dynamic,
) -> Result<(), EvalAltResult> {
match () {
// Direct references insert directly to table
_ if val.is::<SymbolRef>() => {
obj.relocation(RelocKey::Symbol(val.clone_cast::<SymbolRef>().0), ty)
}
_ if val.is::<UnboundLabel>() => {
obj.relocation(RelocKey::Symbol(val.clone_cast::<UnboundLabel>().0), ty)
}
_ if val.is::<DataRef>() => {
obj.relocation(RelocKey::Symbol(val.clone_cast::<DataRef>().symbol.0), ty)
}
// String (indirect) reference
_ if val.is_string() => {
obj.relocation(RelocKey::Label(val.clone_cast::<ImmutableString>()), ty)
}
// Manual offset
_ if val.is_int() => {
let int = val.clone_cast::<i64>();
match ty {
RelocType::Rel32 => obj.sections.text.extend((int as i32).to_le_bytes()),
RelocType::Rel16 => obj.sections.text.extend((int as i16).to_le_bytes()),
RelocType::Abs64 => obj.sections.text.extend(int.to_le_bytes()),
}
}
_ => {
return Err(EvalAltResult::ErrorMismatchDataType(
"SymbolRef, UnboundLabel, String or Int".to_owned(),
val.type_name().to_owned(),
Position::NONE,
))
}
}
Ok(())
}
/// Generate a macro for inserting an item into the output object
///
/// Pre-defines inserts for absolute addresses and relative offsets.
/// These are inserted with the [`insert_reloc`] function.
/// # le_bytes
/// `gen_insert!(le_bytes: [B, …]);`
///
/// Takes sequence of operand types which should be inserted
/// by invoking `to_le_bytes` method on it.
macro_rules! gen_insert {
(le_bytes: [$($lety:ident),* $(,)?]) => {
/// `insert!($thing, $obj, $type)` where
/// - `$thing`: Value you want to insert
/// - `$obj`: Code object
/// - `$type`: Type of inserted value
///
/// Eg. `insert!(69_u8, obj, B);`
macro_rules! insert {
$(($thing:expr, $obj: expr, $lety) => {
$obj.sections.text.extend($thing.to_le_bytes());
};)*
($thing:expr, $obj:expr, A) => {
$crate::ins::optypes::insert_reloc(
$obj,
$crate::object::RelocType::Abs64,
$thing
)?
};
($thing:expr, $obj:expr, O) => {
$crate::ins::optypes::insert_reloc(
$obj,
$crate::object::RelocType::Rel32,
$thing
)?
};
($thing:expr, $obj:expr, P) => {
$crate::ins::optypes::insert_reloc(
$obj,
$crate::object::RelocType::Rel16,
$thing
)?
};
}
};
}
gen_insert!(le_bytes: [R, B, H, W, D]);
#[allow(clippy::single_component_path_imports)]
pub(super) use insert;
use crate::data::DataRef;
}
/// Rhai Types (types for function parameters, as Rhai uses only 64-bit signed integers)
pub mod rity {
pub use super::optypes::{A, O, P, R};
pub type B = i64;
pub type H = i64;
pub type W = i64;
pub type D = i64;
}
/// Generic instruction (instruction of certain operands type) inserts
pub mod generic {
use {crate::object::Object, rhai::EvalAltResult};
pub(super) fn convert_op<A, B>(from: A) -> Result<B, EvalAltResult>
where
B: TryFrom<A>,
<B as TryFrom<A>>::Error: std::error::Error + Sync + Send + 'static,
{
B::try_from(from).map_err(|e| {
EvalAltResult::ErrorSystem("Data conversion error".to_owned(), Box::new(e))
})
}
/// Generate opcode-generic instruction insert macro
macro_rules! gen_ins {
($($($name:ident : $ty:ty),*;)*) => {
paste::paste! {
$(
/// Instruction-generic opcode insertion function
/// - `obj`: Code object
/// - `opcode`: opcode, not checked if valid for instruction type
/// - … for operands
#[inline]
pub fn [<$($ty:lower)*>](
obj: &mut Object,
opcode: u8,
$($name: $crate::ins::optypes::$ty),*,
) -> Result<(), EvalAltResult> {
// Push opcode
obj.sections.text.push(opcode);
// Insert based on type
$($crate::ins::optypes::insert!(&$name, obj, $ty);)*
Ok(())
}
)*
/// Generate Rhai opcode-specific instruction insertion functions
///
/// `gen_ins_fn!($obj, $opcode, $optype);` where:
/// - `$obj`: Code object
/// - `$opcode`: Opcode value
macro_rules! gen_ins_fn {
$(
($obj:expr, $opcode:expr, [<$($ty)*>]) => {
// Opcode-specific insertion function
// - Parameters = operands
move |$($name: $crate::ins::rity::$ty),*| {
// Invoke generic function
$crate::ins::generic::[<$($ty:lower)*>](
&mut *$obj.borrow_mut(),
$opcode,
$(
// Convert to desired type (from Rhai-provided values)
$crate::ins::generic::convert_op::<
_,
$crate::ins::optypes::$ty
>($name)?
),*
)?;
Ok(())
}
};
// Internal-use: count args
(@arg_count [<$($ty)*>]) => {
{ ["", $(stringify!($ty)),*].len() - 1 }
};
)*
// Specialisation for no-operand instructions
($obj:expr, $opcode:expr, N) => {
move || {
$crate::ins::generic::n(&mut *$obj.borrow_mut(), $opcode);
Ok(())
}
};
// Internal-use specialisation: no-operand instructions
(@arg_count N) => {
{ 0 }
};
}
}
};
}
/// Specialisation for no-operand instructions: simply push the opcode
#[inline]
pub fn n(obj: &mut Object, opcode: u8) {
obj.sections.text.push(opcode);
}
// Generate opcode-generic instruction inserters
// (operand identifiers are arbitrary)
//
// New instruction types have to be added manually here
gen_ins! {
o0: R, o1: R;
o0: R, o1: R, o2: R;
o0: R, o1: R, o2: R, o3: R;
o0: R, o1: R, o2: B;
o0: R, o1: R, o2: H;
o0: R, o1: R, o2: W;
o0: R, o1: R, o2: D;
o0: R, o1: B;
o0: R, o1: H;
o0: R, o1: W;
o0: R, o1: D;
o0: R, o1: R, o2: A;
o0: R, o1: R, o2: A, o3: H;
o0: R, o1: R, o2: O, o3: H;
o0: R, o1: R, o2: P, o3: H;
o0: R, o1: R, o2: O;
o0: R, o1: R, o2: P;
o0: O;
o0: P;
}
#[allow(clippy::single_component_path_imports)]
pub(super) use gen_ins_fn;
}
/// Generate instructions from instruction table
///
/// ```ignore
/// instructions!(($module, $obj) {
/// // Data from instruction table
/// $opcode, $mnemonic, $opty, $doc;
/// …
/// });
/// ```
/// - `$module`: Rhai module
/// - `$obj`: Code object
macro_rules! instructions {
(
($module:expr, $obj:expr $(,)?)
{ $($opcode:expr, $mnemonic:ident, $ops:tt, $doc:literal;)* }
) => {{
paste::paste! {
let (module, obj) = ($module, $obj);
$({
// Object is shared across all functions
let obj = Rc::clone(&obj);
// Register newly generated function for each instruction
FuncRegistration::new(stringify!([<$mnemonic:lower>]))
.with_namespace(rhai::FnNamespace::Global)
.set_into_module::<_, { generic::gen_ins_fn!(@arg_count $ops) }, false, _, true, _>(
module,
generic::gen_ins_fn!(
obj,
$opcode,
$ops
)
);
})*
}
}};
}
/// Set up instruction insertion functions
pub fn setup(module: &mut Module, obj: Rc<RefCell<Object>>) {
// Import instructions table and use it for generation
with_builtin_macros::with_builtin! {
let $spec = include_from_root!("../hbbytecode/instructions.in") in {
instructions!((module, obj) { $spec });
}
}
}


@ -1,112 +0,0 @@
//! Stuff related to labels
use {
crate::SharedObject,
rhai::{Engine, FuncRegistration, ImmutableString, Module},
};
/// Macro for creating functions for Rhai which
/// is a bit more friendly
///
/// ```ignore
/// shdm_fns!{
/// module: $module;
/// shared: $shared => $shname;
///
/// $vis fn $name($param_name: $param_ty, …) -> $ret { … }
/// …
/// }
/// ```
/// - `$module`: Rhai module
/// - `$shared`: Data to be shared across the functions
/// - `$shname`: The binding name inside functions
/// - `$vis`: Function visibility for Rhai
/// - Lowercased [`rhai::FnNamespace`] variants
/// - `$name`: Function name
/// - `$param_name`: Parameter name
/// - `$param_ty`: Rust parameter type
/// - `$ret`: Optional return type (otherwise infer)
macro_rules! shdm_fns {
(
module: $module:expr;
shared: $shared:expr => $shname:ident;
$(
$vis:ident fn $name:ident($($param_name:ident: $param_ty:ty),*) $(-> $ret:ty)? $blk:block
)*
) => {{
let module = $module;
let shared = $shared;
paste::paste! {
$({
let $shname = SharedObject::clone(&shared);
FuncRegistration::new(stringify!($name))
.with_namespace(rhai::FnNamespace::[<$vis:camel>])
.set_into_module::<_, { ["", $(stringify!($param_name)),*].len() - 1 }, false, _, true, _>(
module,
move |$($param_name: $param_ty),*| $(-> $ret)? {
let mut $shname = $shname.borrow_mut();
$blk
}
);
})*
}
}};
}
/// Label without any place bound
#[derive(Clone, Copy, Debug)]
pub struct UnboundLabel(pub usize);
pub fn setup(engine: &mut Engine, module: &mut Module, object: SharedObject) {
shdm_fns! {
module: module;
shared: object => obj;
// Insert unnamed label
global fn label() {
let symbol = obj.symbol(crate::object::Section::Text);
Ok(symbol)
}
// Insert string-labeled label
global fn label(label: ImmutableString) {
let symbol = obj.symbol(crate::object::Section::Text);
obj.labels.insert(label, symbol.0);
Ok(symbol)
}
// Declare unbound label (to be bound later)
global fn declabel() {
let index = obj.symbols.len();
obj.symbols.push(None);
Ok(UnboundLabel(index))
}
// Declare unbound label (to be bound later)
// with string label
global fn declabel(label: ImmutableString) {
let index = obj.symbols.len();
obj.symbols.push(None);
obj.labels.insert(label, index);
Ok(UnboundLabel(index))
}
// Set location for unbound label
global fn here(label: UnboundLabel) {
obj.symbols[label.0] = Some(crate::object::SymbolEntry {
location: crate::object::Section::Text,
offset: obj.sections.text.len(),
});
Ok(())
}
}
engine.register_type_with_name::<UnboundLabel>("UnboundLabel");
}


@ -1,45 +0,0 @@
pub mod data;
pub mod ins;
pub mod label;
pub mod linker;
pub mod object;
use {
object::Object,
rhai::{Engine, Module},
std::{cell::RefCell, rc::Rc},
};
type SharedObject = Rc<RefCell<Object>>;
pub fn assembler(
linkout: &mut impl std::io::Write,
loader: impl FnOnce(&mut Engine) -> Result<(), Box<rhai::EvalAltResult>>,
) -> Result<(), Box<dyn std::error::Error>> {
let mut engine = Engine::new();
let mut module = Module::new();
let obj = Rc::new(RefCell::new(Object::default()));
ins::setup(&mut module, Rc::clone(&obj));
label::setup(&mut engine, &mut module, Rc::clone(&obj));
// Registers
for n in 0_u8..=255 {
module.set_var(format!("r{n}"), n);
}
module.set_native_fn("reg", |n: i64| {
Ok(u8::try_from(n).map_err(|_| {
rhai::EvalAltResult::ErrorRuntime("Invalid register value".into(), rhai::Position::NONE)
})?)
});
module.set_native_fn("as_i64", |n: u8| Ok(n as i64));
let datamod = Rc::new(data::module(&mut engine, SharedObject::clone(&obj)));
engine.register_global_module(Rc::new(module));
engine.register_static_module("data", datamod);
engine.register_type_with_name::<object::SymbolRef>("SymbolRef");
loader(&mut engine)?;
linker::link(obj, linkout)?;
Ok(())
}


@ -1,47 +0,0 @@
//! Simple flat-bytecode linker
use {
crate::{
object::{RelocKey, RelocType, Section},
SharedObject,
},
std::io::Write,
};
pub fn link(object: SharedObject, out: &mut impl Write) -> std::io::Result<()> {
let obj = &mut *object.borrow_mut();
// Walk relocation table entries
for (&loc, entry) in &obj.relocs {
let value = match &entry.key {
// Symbol direct reference
RelocKey::Symbol(sym) => obj.symbols[*sym],
// Indirect reference through a label
RelocKey::Label(label) => obj.symbols[obj.labels[label]],
}
.ok_or_else(|| std::io::Error::other("Invalid symbol"))?;
let offset = match value.location {
// Text section is at the beginning
Section::Text => value.offset,
// Data section follows text section immediately
Section::Data => value.offset + obj.sections.text.len(),
};
// Insert address or calculate relative offset
match entry.ty {
RelocType::Rel32 => obj.sections.text[loc..loc + 4]
.copy_from_slice(&((offset as isize - loc as isize) as i32).to_le_bytes()),
RelocType::Rel16 => obj.sections.text[loc..loc + 2]
.copy_from_slice(&((offset as isize - loc as isize) as i16).to_le_bytes()),
RelocType::Abs64 => obj.sections.text[loc..loc + 8]
.copy_from_slice(&(offset as isize - loc as isize).to_le_bytes()),
}
}
// Write to output
out.write_all(&obj.sections.text)?;
out.write_all(&obj.sections.data)
}
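To make the relocation arithmetic concrete (editor's note): if a `Rel32` entry was recorded at text offset 10 and its target symbol resolves to text offset 4, the four bytes at 10..14 become -6 encoded as a little-endian i32; a symbol living in the data section first has the length of the text section added to its offset, since the data blob is written right after the text blob.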


@ -1,8 +0,0 @@
use std::{io::stdout, path::PathBuf};
fn main() -> Result<(), Box<dyn std::error::Error>> {
let path = PathBuf::from(std::env::args().nth(1).ok_or("Missing path")?);
hbasm::assembler(&mut stdout(), |engine| engine.run_file(path))?;
Ok(())
}


@ -1,94 +0,0 @@
//! Code object
use {rhai::ImmutableString, std::collections::HashMap};
/// Section table
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Section {
Text,
Data,
}
/// Symbol entry (in what section, where)
#[derive(Clone, Copy, Debug)]
pub struct SymbolEntry {
pub location: Section,
pub offset: usize,
}
/// Relocation table key
#[derive(Clone, Debug)]
pub enum RelocKey {
/// Direct reference
Symbol(usize),
/// Indirect reference
Label(ImmutableString),
}
/// Relocation type
#[derive(Clone, Copy, Debug)]
pub enum RelocType {
Rel32,
Rel16,
Abs64,
}
/// Relocation table entry
#[derive(Clone, Debug)]
pub struct RelocEntry {
pub key: RelocKey,
pub ty: RelocType,
}
/// Object code
#[derive(Clone, Debug, Default)]
pub struct Sections {
pub text: Vec<u8>,
pub data: Vec<u8>,
}
/// Object
#[derive(Clone, Debug, Default)]
pub struct Object {
/// Vectors with sections
pub sections: Sections,
/// Symbol table
pub symbols: Vec<Option<SymbolEntry>>,
/// Labels to symbols table
pub labels: HashMap<ImmutableString, usize>,
/// Relocation table
pub relocs: HashMap<usize, RelocEntry>,
}
#[derive(Clone, Copy, Debug)]
#[repr(transparent)]
pub struct SymbolRef(pub usize);
impl Object {
/// Insert symbol at current location in specified section
pub fn symbol(&mut self, section: Section) -> SymbolRef {
let section_buf = match section {
Section::Text => &mut self.sections.text,
Section::Data => &mut self.sections.data,
};
self.symbols.push(Some(SymbolEntry {
location: section,
offset: section_buf.len(),
}));
SymbolRef(self.symbols.len() - 1)
}
/// Insert to relocation table and write zeroes to code
pub fn relocation(&mut self, key: RelocKey, ty: RelocType) {
self.relocs
.insert(self.sections.text.len(), RelocEntry { key, ty });
self.sections.text.extend(match ty {
RelocType::Rel32 => &[0_u8; 4] as &[u8],
RelocType::Rel16 => &[0; 2],
RelocType::Abs64 => &[0; 8],
});
}
}
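How `symbol` and `relocation` fit together, as an editor's sketch (not part of the diff) that could sit in a test next to the definitions above:

#[test]
fn symbol_then_relocation() {
    let mut obj = Object::default();
    // Records a symbol at the current end of the data section (offset 0).
    let sym = obj.symbol(Section::Data);
    // Records a relocation keyed by the current text offset (0) and pads the
    // text section with 8 zero bytes, the width of an absolute 64-bit slot.
    obj.relocation(RelocKey::Symbol(sym.0), RelocType::Abs64);
    assert_eq!(obj.sections.text, [0u8; 8]);
    assert!(obj.relocs.contains_key(&0));
}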


@ -1,185 +0,0 @@
#![no_std]
use core::convert::TryFrom;
type OpR = u8;
type OpA = u64;
type OpO = i32;
type OpP = i16;
type OpB = u8;
type OpH = u16;
type OpW = u32;
type OpD = u64;
/// # Safety
/// Has to be valid to be decoded from bytecode.
pub unsafe trait BytecodeItem {}
macro_rules! define_items {
($($name:ident ($($nm:ident: $item:ident),* $(,)?)),* $(,)?) => {
$(
#[derive(Clone, Copy, Debug)]
#[repr(packed)]
pub struct $name($(pub $item),*);
unsafe impl BytecodeItem for $name {}
impl Encodable for $name {
fn encode(self, _buffer: &mut impl Buffer) {
let Self($($nm),*) = self;
$(
for byte in $nm.to_le_bytes() {
unsafe { _buffer.write(byte) };
}
)*
}
fn encode_len(self) -> usize {
core::mem::size_of::<Self>()
}
}
)*
};
}
define_items! {
OpsRR (a: OpR, b: OpR ),
OpsRRR (a: OpR, b: OpR, c: OpR ),
OpsRRRR (a: OpR, b: OpR, c: OpR, d: OpR),
OpsRRB (a: OpR, b: OpR, c: OpB ),
OpsRRH (a: OpR, b: OpR, c: OpH ),
OpsRRW (a: OpR, b: OpR, c: OpW ),
OpsRRD (a: OpR, b: OpR, c: OpD ),
OpsRB (a: OpR, b: OpB ),
OpsRH (a: OpR, b: OpH ),
OpsRW (a: OpR, b: OpW ),
OpsRD (a: OpR, b: OpD ),
OpsRRA (a: OpR, b: OpR, c: OpA ),
OpsRRAH (a: OpR, b: OpR, c: OpA, d: OpH),
OpsRROH (a: OpR, b: OpR, c: OpO, d: OpH),
OpsRRPH (a: OpR, b: OpR, c: OpP, d: OpH),
OpsRRO (a: OpR, b: OpR, c: OpO ),
OpsRRP (a: OpR, b: OpR, c: OpP ),
OpsO (a: OpO, ),
OpsP (a: OpP, ),
OpsN ( ),
}
unsafe impl BytecodeItem for u8 {}
::with_builtin_macros::with_builtin! {
let $spec = include_from_root!("instructions.in") in {
/// Invoke macro with bytecode definition
///
/// # Format
/// ```text
/// Opcode, Mnemonic, Type, Docstring;
/// ```
///
/// # Type
/// ```text
/// Types consist of letters, each letter denoting a single field
/// | Type | Size (B) | Meaning |
/// |:-----|:---------|:------------------------|
/// | N | 0 | Empty |
/// | R | 1 | Register |
/// | A | 8 | Absolute address |
/// | O | 4 | Relative address offset |
/// | P | 2 | Relative address offset |
/// | B | 1 | Immediate |
/// | H | 2 | Immediate |
/// | W | 4 | Immediate |
/// | D | 8 | Immediate |
/// ```
#[macro_export]
macro_rules! invoke_with_def {
($($macro:tt)*) => {
$($macro)*! { $spec }
};
}
}
}
pub trait Buffer {
fn reserve(&mut self, bytes: usize);
/// # Safety
/// `reserve` needs to be called before this function, and only the reserved amount can be written.
unsafe fn write(&mut self, byte: u8);
}
pub trait Encodable {
fn encode(self, buffer: &mut impl Buffer);
fn encode_len(self) -> usize;
}
macro_rules! gen_opcodes {
($($opcode:expr, $mnemonic:ident, $ty:ident, $doc:literal;)*) => {
pub mod opcode {
$(
#[doc = $doc]
pub const $mnemonic: u8 = $opcode;
)*
paste::paste! {
#[derive(Clone, Copy, Debug)]
#[repr(u8)]
pub enum Op { $(
[< $mnemonic:lower:camel >](super::[<Ops $ty>]),
)* }
impl Op {
pub fn size(&self) -> usize {
(match self {
$(Self::[<$mnemonic:lower:camel>] { .. } => core::mem::size_of::<super::[<Ops $ty>]>(),)*
}) + 1
}
}
impl crate::Encodable for Op {
fn encode(self, buffer: &mut impl crate::Buffer) {
match self {
$(
Self::[< $mnemonic:lower:camel >](op) => {
unsafe { buffer.write($opcode) };
op.encode(buffer);
}
)*
}
}
fn encode_len(self) -> usize {
match self {
$(
Self::[< $mnemonic:lower:camel >](op) => {
1 + crate::Encodable::encode_len(op)
}
)*
}
}
}
}
}
};
}
/// Rounding mode
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
#[repr(u8)]
pub enum RoundingMode {
NearestEven = 0,
Truncate = 1,
Up = 2,
Down = 3,
}
impl TryFrom<u8> for RoundingMode {
type Error = ();
fn try_from(value: u8) -> Result<Self, Self::Error> {
(value <= 3)
.then(|| unsafe { core::mem::transmute(value) })
.ok_or(())
}
}
invoke_with_def!(gen_opcodes);
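The table fed into `invoke_with_def!` lives in `instructions.in`, which this diff does not show. For orientation only (editor's sketch with a made-up opcode and mnemonic), one entry in the documented `Opcode, Mnemonic, Type, Docstring;` format would look like

0x01, EXAMPLE, RRR, "An example three-register instruction";

and `gen_opcodes` would turn it into `pub const EXAMPLE: u8 = 0x01;` plus an `Op::Example(OpsRRR)` enum variant.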


@ -1,47 +0,0 @@
use std::alloc::Layout;
use hbvm::mem::{Address, LoadError, Memory, StoreError};
pub struct HostMemory;
impl Memory for HostMemory {
#[inline]
unsafe fn load(
&mut self,
addr: Address,
target: *mut u8,
count: usize,
) -> Result<(), LoadError> {
unsafe { core::ptr::copy(addr.get() as *const u8, target, count) }
Ok(())
}
#[inline]
unsafe fn store(
&mut self,
addr: Address,
source: *const u8,
count: usize,
) -> Result<(), StoreError> {
unsafe { core::ptr::copy(source, addr.get() as *mut u8, count) }
Ok(())
}
#[inline]
unsafe fn prog_read<T: Copy>(&mut self, addr: Address) -> T {
unsafe { core::ptr::read(addr.get() as *const T) }
}
}
const STACK_SIZE: usize = 2; // MiB
type Stack = [u8; 1024 * 1024 * STACK_SIZE];
/// Allocate stack of size [`STACK_SIZE`] MiB
pub unsafe fn alloc_stack() -> Box<Stack> {
let layout = Layout::new::<Stack>();
let ptr = unsafe { std::alloc::alloc(layout) };
if ptr.is_null() {
std::alloc::handle_alloc_error(layout);
}
unsafe { Box::from_raw(ptr.cast()) }
}

23
lang/Cargo.toml Normal file

@ -0,0 +1,23 @@
[package]
name = "hblang"
version = "0.1.0"
edition = "2021"
[[bin]]
name = "hbc"
path = "src/main.rs"
[[bin]]
name = "fuzz"
path = "src/fuzz_main.rs"
[dependencies]
hbbytecode = { workspace = true, features = ["disasm"] }
hbvm = { workspace = true, features = ["nightly"] }
hashbrown = { version = "0.15.0", default-features = false, features = ["raw-entry"] }
log = "0.4.22"
[features]
default = ["std"]
std = []
no_log = ["log/max_level_off"]

1871
lang/README.md Normal file

File diff suppressed because one or more lines are too long

4
lang/command-help.txt Normal file

@ -0,0 +1,4 @@
--fmt - format all imported source files
--fmt-stdout - don't write the formatted file, print it to stdout instead
--dump-asm - output assembly instead of raw code (the assembly is mostly for debugging the compiler)
--threads <1...> - number of extra threads compiler can use [default: 0]
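Two illustrative invocations (editor's sketch; the `hbc` binary name comes from lang/Cargo.toml and `main.hb` is the default input in lang/src/main.rs):

hbc --fmt-stdout main.hb
hbc main.hb --dump-asm > main.asm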

635
lang/src/fmt.rs Normal file

@ -0,0 +1,635 @@
use {
crate::{
lexer::{self, Lexer, TokenKind},
parser::{self, CommentOr, CtorField, EnumField, Expr, Poser, Radix, StructField},
},
core::{
fmt::{self},
mem,
},
};
pub fn display_radix(radix: Radix, mut value: u64, buf: &mut [u8; 64]) -> &str {
fn conv_radix(d: u8) -> u8 {
match d {
0..=9 => d + b'0',
_ => d - 10 + b'A',
}
}
for (i, b) in buf.iter_mut().enumerate().rev() {
let d = (value % radix as u64) as u8;
value /= radix as u64;
*b = conv_radix(d);
if value == 0 {
return unsafe { core::str::from_utf8_unchecked(&buf[i..]) };
}
}
unreachable!()
}
#[repr(u8)]
enum TokenGroup {
Blank,
Comment,
Keyword,
Identifier,
Directive,
Number,
String,
Op,
Assign,
Paren,
Bracket,
Colon,
Comma,
Dot,
Ctor,
}
fn token_group(kind: TokenKind) -> TokenGroup {
use {crate::lexer::TokenKind::*, TokenGroup as TG};
match kind {
BSlash | Pound | Eof | Ct => TG::Blank,
Comment => TG::Comment,
Directive => TG::Directive,
Colon => TG::Colon,
Semi | Comma => TG::Comma,
Dot => TG::Dot,
Ctor | Tupl | TArrow => TG::Ctor,
LParen | RParen => TG::Paren,
LBrace | RBrace | LBrack | RBrack => TG::Bracket,
Number | Float => TG::Number,
Under | CtIdent | Ident => TG::Identifier,
Tick | Tilde | Que | Not | Mod | Band | Bor | Xor | Mul | Add | Sub | Div | Shl | Shr
| Or | And | Lt | Gt | Eq | Le | Ge | Ne => TG::Op,
Decl | Assign | BorAss | XorAss | BandAss | AddAss | SubAss | MulAss | DivAss | ModAss
| ShrAss | ShlAss => TG::Assign,
DQuote | Quote => TG::String,
Slf | Defer | Return | If | Else | Loop | Break | Continue | Fn | Idk | Die | Struct
| Packed | True | False | Null | Match | Enum => TG::Keyword,
}
}
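/// Overwrites `source` in place with one `TokenGroup` discriminant per token
/// byte (zeroing the bytes in between) and returns the original length,
/// presumably to drive syntax highlighting.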
pub fn get_token_kinds(mut source: &mut [u8]) -> usize {
let len = source.len();
loop {
let src = unsafe { core::str::from_utf8_unchecked(source) };
let mut token = lexer::Lexer::new(src).eat();
match token.kind {
TokenKind::Eof => break,
// the lexer excludes the `$`/`@` sigil from the token range, so re-include it here
TokenKind::CtIdent | TokenKind::Directive => token.start -= 1,
_ => {}
}
let start = token.start as usize;
let end = token.end as usize;
source[..start].fill(0);
source[start..end].fill(token_group(token.kind) as u8);
source = &mut source[end..];
}
len
}
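/// Minifies `source` in place by re-lexing it and copying the tokens back
/// over the buffer, keeping only the spaces and newlines needed to preserve
/// meaning; returns the length of the minified prefix.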
pub fn minify(source: &mut str) -> usize {
fn needs_space(c: u8) -> bool {
matches!(c, b'a'..=b'z' | b'A'..=b'Z' | b'0'..=b'9' | 127..)
}
let mut writer = source.as_mut_ptr();
let mut reader = &source[..];
let mut prev_needs_whitecpace = false;
let mut prev_needs_newline = false;
loop {
let mut token = lexer::Lexer::new(reader).eat();
match token.kind {
TokenKind::Eof => break,
TokenKind::CtIdent | TokenKind::Directive => token.start -= 1,
_ => {}
}
let cpy_len = token.range().len();
let mut prefix = 0;
if prev_needs_whitecpace && needs_space(reader.as_bytes()[token.start as usize]) {
prefix = b' ';
debug_assert!(token.start != 0, "{reader}");
}
prev_needs_whitecpace = needs_space(reader.as_bytes()[token.end as usize - 1]);
let inbetween_new_lines =
reader[..token.start as usize].bytes().filter(|&b| b == b'\n').count()
+ token.kind.precedence().is_some() as usize;
let extra_prefix_new_lines = if inbetween_new_lines > 1 {
1 + token.kind.precedence().is_none() as usize
} else {
prev_needs_newline as usize
};
if token.kind == TokenKind::Comment && reader.as_bytes()[token.end as usize - 1] != b'/' {
prev_needs_newline = true;
prev_needs_whitecpace = false;
} else {
prev_needs_newline = false;
}
let sstr = reader[token.start as usize..].as_ptr();
reader = &reader[token.end as usize..];
unsafe {
if extra_prefix_new_lines != 0 {
for _ in 0..extra_prefix_new_lines {
writer.write(b'\n');
writer = writer.add(1);
}
} else if prefix != 0 {
writer.write(prefix);
writer = writer.add(1);
}
writer.copy_from(sstr, cpy_len);
writer = writer.add(cpy_len);
}
}
unsafe { writer.sub_ptr(source.as_mut_ptr()) }
}
pub struct Formatter<'a> {
source: &'a str,
depth: usize,
}
// we exclusively use `write_str` to reduce bloat
impl<'a> Formatter<'a> {
pub fn new(source: &'a str) -> Self {
Self { source, depth: 0 }
}
fn fmt_list<T: Poser, F: core::fmt::Write>(
&mut self,
f: &mut F,
trailing: bool,
end: &str,
sep: &str,
list: &[T],
fmt: impl Fn(&mut Self, &T, &mut F) -> fmt::Result,
) -> fmt::Result {
self.fmt_list_low(f, trailing, end, sep, list, |s, v, f| {
fmt(s, v, f)?;
Ok(true)
})
}
fn fmt_list_low<T: Poser, F: core::fmt::Write>(
&mut self,
f: &mut F,
trailing: bool,
end: &str,
sep: &str,
list: &[T],
fmt: impl Fn(&mut Self, &T, &mut F) -> Result<bool, fmt::Error>,
) -> fmt::Result {
if !trailing {
let mut first = true;
for expr in list {
if !core::mem::take(&mut first) {
f.write_str(sep)?;
f.write_str(" ")?;
}
first = !fmt(self, expr, f)?;
}
return f.write_str(end);
}
if !end.is_empty() {
writeln!(f)?;
}
self.depth += !end.is_empty() as usize;
let mut already_indented = end.is_empty();
let res = (|| {
for (i, stmt) in list.iter().enumerate() {
if !mem::take(&mut already_indented) {
for _ in 0..self.depth {
f.write_str("\t")?;
}
}
let add_sep = fmt(self, stmt, f)?;
if add_sep {
f.write_str(sep)?;
}
if let Some(expr) = list.get(i + 1)
&& let Some(rest) = self.source.get(expr.posi() as usize..)
{
if sep.is_empty() && insert_needed_semicolon(rest) {
f.write_str(";")?;
}
if preserve_newlines(&self.source[..expr.posi() as usize]) > 1 {
f.write_str("\n")?;
}
}
if add_sep {
f.write_str("\n")?;
}
}
Ok(())
})();
self.depth -= !end.is_empty() as usize;
if !end.is_empty() {
for _ in 0..self.depth {
f.write_str("\t")?;
}
f.write_str(end)?;
}
res
}
fn fmt_paren<F: core::fmt::Write>(
&mut self,
expr: &Expr,
f: &mut F,
cond: impl FnOnce(&Expr) -> bool,
) -> fmt::Result {
if cond(expr) {
f.write_str("(")?;
self.fmt(expr, f)?;
f.write_str(")")
} else {
self.fmt(expr, f)
}
}
pub fn fmt<F: core::fmt::Write>(&mut self, expr: &Expr, f: &mut F) -> fmt::Result {
macro_rules! impl_parenter {
($($name:ident => $pat:pat,)*) => {
$(
let $name = |e: &Expr| matches!(e, $pat);
)*
};
}
impl_parenter! {
unary => Expr::BinOp { .. },
postfix => Expr::UnOp { .. } | Expr::BinOp { .. },
consecutive => Expr::UnOp { .. },
}
match *expr {
Expr::Ct { value, .. } => {
f.write_str("$: ")?;
self.fmt(value, f)
}
Expr::Defer { value, .. } => {
f.write_str("defer ")?;
self.fmt(value, f)
}
Expr::Slf { .. } => f.write_str("Self"),
Expr::String { literal, .. } => f.write_str(literal),
Expr::Comment { literal, .. } => f.write_str(literal),
Expr::Mod { path, .. } => write!(f, "@use(\"{path}\")"),
Expr::Embed { path, .. } => write!(f, "@embed(\"{path}\")"),
Expr::Field { target, name: field, .. } => {
self.fmt_paren(target, f, postfix)?;
f.write_str(".")?;
f.write_str(field)
}
Expr::Directive { name, args, .. } => {
f.write_str("@")?;
f.write_str(name)?;
f.write_str("(")?;
self.fmt_list(f, false, ")", ",", args, Self::fmt)
}
Expr::Struct { fields, trailing_comma, packed, .. } => {
if packed {
f.write_str("packed ")?;
}
f.write_str("struct {")?;
self.fmt_list_low(f, trailing_comma, "}", ",", fields, |s, field, f| {
match field {
CommentOr::Or(Ok(StructField { name, ty, .. })) => {
f.write_str(name)?;
f.write_str(": ")?;
s.fmt(ty, f)?
}
CommentOr::Or(Err(scope)) => {
s.fmt_list(f, true, "", "", scope, Self::fmt)?;
return Ok(false);
}
CommentOr::Comment { literal, .. } => {
f.write_str(literal)?;
f.write_str("\n")?;
}
}
Ok(field.or().is_some())
})
}
Expr::Enum { variants, trailing_comma, .. } => {
f.write_str("enum {")?;
self.fmt_list_low(f, trailing_comma, "}", ",", variants, |_, var, f| {
match var {
CommentOr::Or(EnumField { name, .. }) => {
f.write_str(name)?;
}
CommentOr::Comment { literal, .. } => {
f.write_str(literal)?;
f.write_str("\n")?;
}
}
Ok(var.or().is_some())
})
}
Expr::Ctor { ty, fields, trailing_comma, .. } => {
if let Some(ty) = ty {
self.fmt_paren(ty, f, unary)?;
}
f.write_str(".{")?;
self.fmt_list(
f,
trailing_comma,
"}",
",",
fields,
|s: &mut Self, CtorField { name, value, .. }: &_, f| {
f.write_str(name)?;
if !matches!(value, &Expr::Ident { id, .. } if *name == &self.source[id.range()]) {
f.write_str(": ")?;
s.fmt(value, f)?;
}
Ok(())
},
)
}
Expr::Tupl {
pos,
ty: Some(&Expr::Slice { pos: spos, size: Some(&Expr::Number { value, .. }), item }),
fields,
trailing_comma,
} if value as usize == fields.len() => self.fmt(
&Expr::Tupl {
pos,
ty: Some(&Expr::Slice { pos: spos, size: None, item }),
fields,
trailing_comma,
},
f,
),
Expr::Tupl { ty, fields, trailing_comma, .. } => {
if let Some(ty) = ty {
self.fmt_paren(ty, f, unary)?;
}
f.write_str(".(")?;
self.fmt_list(f, trailing_comma, ")", ",", fields, Self::fmt)
}
Expr::Slice { item, size, .. } => {
f.write_str("[")?;
self.fmt(item, f)?;
if let Some(size) = size {
f.write_str("; ")?;
self.fmt(size, f)?;
}
f.write_str("]")
}
Expr::Index { base, index } => {
self.fmt(base, f)?;
f.write_str("[")?;
self.fmt(index, f)?;
f.write_str("]")
}
Expr::UnOp { op, val, .. } => {
f.write_str(op.name())?;
self.fmt_paren(val, f, unary)
}
Expr::Break { .. } => f.write_str("break"),
Expr::Continue { .. } => f.write_str("continue"),
Expr::If { cond, then, else_, .. } => {
f.write_str("if ")?;
self.fmt(cond, f)?;
f.write_str(" ")?;
self.fmt_paren(then, f, consecutive)?;
if let Some(e) = else_ {
f.write_str(" else ")?;
self.fmt(e, f)?;
}
Ok(())
}
Expr::Match { value, branches, .. } => {
f.write_str("match ")?;
self.fmt(value, f)?;
f.write_str(" {")?;
self.fmt_list(f, true, "}", ",", branches, |s, br, f| {
s.fmt(&br.pat, f)?;
f.write_str(" => ")?;
s.fmt(&br.body, f)
})
}
Expr::Loop { body, .. } => {
f.write_str("loop ")?;
self.fmt(body, f)
}
Expr::Closure { ret, body, args, .. } => {
f.write_str("fn(")?;
self.fmt_list(f, false, "", ",", args, |s, arg, f| {
if arg.is_ct {
f.write_str("$")?;
}
f.write_str(arg.name)?;
f.write_str(": ")?;
s.fmt(&arg.ty, f)
})?;
f.write_str("): ")?;
self.fmt(ret, f)?;
f.write_str(" ")?;
self.fmt_paren(body, f, consecutive)?;
Ok(())
}
Expr::Call { func, args, trailing_comma } => {
self.fmt_paren(func, f, postfix)?;
f.write_str("(")?;
self.fmt_list(f, trailing_comma, ")", ",", args, Self::fmt)
}
Expr::Return { val: Some(val), .. } => {
f.write_str("return ")?;
self.fmt(val, f)
}
Expr::Return { val: None, .. } => f.write_str("return"),
Expr::Wildcard { .. } => f.write_str("_"),
Expr::Ident { pos, is_ct, .. } => {
if is_ct {
f.write_str("$")?;
}
f.write_str(&self.source[Lexer::restore(self.source, pos).eat().range()])
}
Expr::Block { stmts, .. } => {
f.write_str("{")?;
self.fmt_list(f, true, "}", "", stmts, Self::fmt)
}
Expr::Number { value, radix, .. } => {
f.write_str(match radix {
Radix::Decimal => "",
Radix::Hex => "0x",
Radix::Octal => "0o",
Radix::Binary => "0b",
})?;
let mut buf = [0u8; 64];
f.write_str(display_radix(radix, value as u64, &mut buf))
}
Expr::Float { pos, .. } => {
f.write_str(&self.source[Lexer::restore(self.source, pos).eat().range()])
}
Expr::Bool { value, .. } => f.write_str(if value { "true" } else { "false" }),
Expr::Idk { .. } => f.write_str("idk"),
Expr::Die { .. } => f.write_str("die"),
Expr::Null { .. } => f.write_str("null"),
Expr::BinOp {
left,
op: TokenKind::Assign,
right: &Expr::BinOp { left: lleft, op, right, .. },
..
} if left.pos() == lleft.pos() => {
self.fmt(left, f)?;
f.write_str(" ")?;
f.write_str(op.name())?;
f.write_str("= ")?;
self.fmt(right, f)
}
Expr::BinOp { right, op, left, .. } => {
let prec_miss_left = |e: &Expr| {
matches!(
e, Expr::BinOp { op: lop, .. } if op.precedence() > lop.precedence()
)
};
let prec_miss_right = |e: &Expr| {
matches!(
e, Expr::BinOp { op: lop, .. }
if (op.precedence() == lop.precedence() && !op.is_comutative())
|| op.precedence() > lop.precedence()
)
};
self.fmt_paren(left, f, prec_miss_left)?;
if let Some(mut prev) = self.source.get(..right.pos() as usize) {
prev = prev.trim_end();
let estimate_bound =
prev.rfind(|c: char| c.is_ascii_whitespace()).map_or(prev.len(), |i| i + 1);
let exact_bound = lexer::Lexer::new(&prev[estimate_bound..]).last().start;
prev = &prev[..exact_bound as usize + estimate_bound];
if preserve_newlines(prev) > 0 {
f.write_str("\n")?;
for _ in 0..self.depth + 1 {
f.write_str("\t")?;
}
f.write_str(op.name())?;
f.write_str(" ")?;
} else {
f.write_str(" ")?;
f.write_str(op.name())?;
f.write_str(" ")?;
}
} else {
f.write_str(" ")?;
f.write_str(op.name())?;
f.write_str(" ")?;
}
self.fmt_paren(right, f, prec_miss_right)
}
}
}
}
pub fn preserve_newlines(source: &str) -> usize {
source[source.trim_end().len()..].bytes().filter(|&c| c == b'\n').count()
}
pub fn insert_needed_semicolon(source: &str) -> bool {
let kind = lexer::Lexer::new(source).eat().kind;
kind.precedence().is_some() || matches!(kind, TokenKind::Ctor | TokenKind::Tupl)
}
impl core::fmt::Display for parser::Ast {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt_file(self.exprs(), &self.file, f)
}
}
pub fn fmt_file(exprs: &[Expr], file: &str, f: &mut impl fmt::Write) -> fmt::Result {
for (i, expr) in exprs.iter().enumerate() {
Formatter::new(file).fmt(expr, f)?;
if let Some(expr) = exprs.get(i + 1)
&& let Some(rest) = file.get(expr.pos() as usize..)
{
if insert_needed_semicolon(rest) {
write!(f, ";")?;
}
if preserve_newlines(&file[..expr.pos() as usize]) > 1 {
writeln!(f)?;
}
}
if i + 1 != exprs.len() {
writeln!(f)?;
}
}
Ok(())
}
#[cfg(test)]
pub mod test {
use {
crate::parser::{self, Ctx},
alloc::borrow::ToOwned,
std::{fmt::Write, string::String},
};
pub fn format(ident: &str, input: &str) {
let mut minned = input.to_owned();
let len = crate::fmt::minify(&mut minned);
minned.truncate(len);
let mut ctx = Ctx::default();
let ast = parser::Ast::new(ident, minned, &mut ctx, &mut parser::no_loader);
log::info!("{}", ctx.errors.borrow());
let mut output = String::new();
write!(output, "{ast}").unwrap();
let input_path = format!("formatter_{ident}.expected");
let output_path = format!("formatter_{ident}.actual");
std::fs::write(&input_path, input).unwrap();
std::fs::write(&output_path, output).unwrap();
let success = std::process::Command::new("diff")
.arg("-u")
.arg("--color")
.arg(&input_path)
.arg(&output_path)
.status()
.unwrap()
.success();
std::fs::remove_file(&input_path).unwrap();
std::fs::remove_file(&output_path).unwrap();
assert!(success, "test failed");
}
macro_rules! test {
($($name:ident => $input:expr;)*) => {$(
#[test]
fn $name() {
format(stringify!($name), $input);
}
)*};
}
test! {
comments => "// comment\n// comment\n\n// comment\n\n\
/* comment */\n/* comment */\n\n/* comment */";
some_ordinary_code => "loft := fn(): int return loft(1, 2, 3)";
some_arg_per_line_code => "loft := fn(): int return loft(\
\n\t1,\n\t2,\n\t3,\n)";
some_ordinary_struct => "loft := fn(): int return loft.{a: 1, b: 2}";
some_ordinary_fild_per_lin_struct => "loft := fn(): int return loft.{\
\n\ta: 1,\n\tb: 2,\n}";
code_block => "loft := fn(): int {\n\tloft()\n\treturn 1\n}";
}
}

384
lang/src/fs.rs Normal file

@ -0,0 +1,384 @@
use {
crate::{
parser::{Ast, Ctx, FileKind},
son::{self, hbvm::HbvmBackend},
ty, FnvBuildHasher,
},
alloc::{string::String, vec::Vec},
core::{fmt::Write, num::NonZeroUsize, ops::Deref},
hashbrown::hash_map,
std::{
collections::VecDeque,
eprintln,
ffi::OsStr,
io::{self, Write as _},
path::{Path, PathBuf},
string::ToString,
sync::Mutex,
},
};
type HashMap<K, V> = hashbrown::HashMap<K, V, FnvBuildHasher>;
pub struct Logger;
impl log::Log for Logger {
fn enabled(&self, _: &log::Metadata) -> bool {
true
}
fn log(&self, record: &log::Record) {
if self.enabled(record.metadata()) {
eprintln!("{}", record.args())
}
}
fn flush(&self) {}
}
#[derive(Default)]
pub struct Options {
pub fmt: bool,
pub fmt_stdout: bool,
pub dump_asm: bool,
pub in_house_regalloc: bool,
pub extra_threads: usize,
}
impl Options {
pub fn from_args(args: &[&str], out: &mut Vec<u8>) -> std::io::Result<Self> {
if args.contains(&"--help") || args.contains(&"-h") {
writeln!(out, "Usage: hbc [OPTIONS...] <FILE>")?;
writeln!(out, include_str!("../command-help.txt"))?;
return Err(std::io::ErrorKind::Other.into());
}
Ok(Options {
fmt: args.contains(&"--fmt"),
fmt_stdout: args.contains(&"--fmt-stdout"),
dump_asm: args.contains(&"--dump-asm"),
in_house_regalloc: args.contains(&"--in-house-regalloc"),
extra_threads: args
.iter()
.position(|&a| a == "--threads")
.map(|i| {
args[i + 1].parse::<NonZeroUsize>().map_err(|e| {
writeln!(out, "--threads expects a non-zero integer: {e}")
.err()
.unwrap_or(std::io::ErrorKind::Other.into())
})
})
.transpose()?
.map_or(1, NonZeroUsize::get)
- 1,
})
}
}
pub fn run_compiler(
root_file: &str,
options: Options,
out: &mut Vec<u8>,
warnings: &mut String,
) -> std::io::Result<()> {
let parsed = parse_from_fs(options.extra_threads, root_file)?;
if (options.fmt || options.fmt_stdout) && !parsed.errors.is_empty() {
*out = parsed.errors.into_bytes();
return Err(std::io::Error::other("fmt failed (errors are in out)"));
}
if options.fmt {
let mut output = String::new();
for ast in parsed.ast {
write!(output, "{ast}").unwrap();
if ast.file.deref().trim() != output.as_str().trim() {
std::fs::write(&*ast.path, &output)?;
}
output.clear();
}
} else if options.fmt_stdout {
write!(out, "{}", &parsed.ast[0])?;
} else {
let mut backend = HbvmBackend::default();
backend.use_in_house_regalloc = options.in_house_regalloc;
let mut ctx = crate::son::CodegenCtx::default();
*ctx.parser.errors.get_mut() = parsed.errors;
let mut codegen = son::Codegen::new(&mut backend, &parsed.ast, &mut ctx);
codegen.push_embeds(parsed.embeds);
codegen.generate(ty::Module::MAIN);
*warnings = core::mem::take(&mut *codegen.warnings.borrow_mut());
if !codegen.errors.borrow().is_empty() {
drop(codegen);
*out = ctx.parser.errors.into_inner().into_bytes();
return Err(std::io::Error::other("compilation failed (errors are in out)"));
}
codegen.assemble(out);
if options.dump_asm {
let mut disasm = String::new();
codegen.disasm(&mut disasm, out).map_err(|e| io::Error::other(e.to_string()))?;
*out = disasm.into_bytes();
}
}
Ok(())
}
struct TaskQueue<T> {
inner: Mutex<TaskQueueInner<T>>,
}
impl<T> TaskQueue<T> {
fn new(max_waiters: usize) -> Self {
Self { inner: Mutex::new(TaskQueueInner::new(max_waiters)) }
}
pub fn push(&self, message: T) {
self.extend([message]);
}
pub fn extend(&self, messages: impl IntoIterator<Item = T>) {
self.inner.lock().unwrap().push(messages);
}
pub fn pop(&self) -> Option<T> {
TaskQueueInner::pop(&self.inner)
}
}
enum TaskSlot<T> {
Waiting,
Delivered(T),
Closed,
}
struct TaskQueueInner<T> {
max_waiters: usize,
messages: VecDeque<T>,
parked: VecDeque<(*mut TaskSlot<T>, std::thread::Thread)>,
}
unsafe impl<T: Send> Send for TaskQueueInner<T> {}
unsafe impl<T: Send + Sync> Sync for TaskQueueInner<T> {}
impl<T> TaskQueueInner<T> {
fn new(max_waiters: usize) -> Self {
Self { max_waiters, messages: Default::default(), parked: Default::default() }
}
fn push(&mut self, messages: impl IntoIterator<Item = T>) {
for msg in messages {
if let Some((dest, thread)) = self.parked.pop_front() {
unsafe { *dest = TaskSlot::Delivered(msg) };
thread.unpark();
} else {
self.messages.push_back(msg);
}
}
}
fn pop(s: &Mutex<Self>) -> Option<T> {
let mut res = TaskSlot::Waiting;
{
let mut s = s.lock().unwrap();
if let Some(msg) = s.messages.pop_front() {
return Some(msg);
}
if s.max_waiters == s.parked.len() + 1 {
for (dest, thread) in s.parked.drain(..) {
unsafe { *dest = TaskSlot::Closed };
thread.unpark();
}
return None;
}
s.parked.push_back((&mut res, std::thread::current()));
}
loop {
std::thread::park();
let _s = s.lock().unwrap();
match core::mem::replace(&mut res, TaskSlot::Waiting) {
TaskSlot::Delivered(msg) => return Some(msg),
TaskSlot::Closed => return None,
TaskSlot::Waiting => {}
}
}
}
}
pub struct Loaded {
ast: Vec<Ast>,
embeds: Vec<Vec<u8>>,
errors: String,
}
pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
fn resolve(path: &str, from: &str, tmp: &mut PathBuf) -> Result<PathBuf, CantLoadFile> {
tmp.clear();
match Path::new(from).parent() {
Some(parent) => tmp.extend([parent, Path::new(path)]),
None => tmp.push(path),
};
tmp.canonicalize().map_err(|source| CantLoadFile { path: std::mem::take(tmp), source })
}
#[derive(Debug)]
struct CantLoadFile {
path: PathBuf,
source: io::Error,
}
impl core::fmt::Display for CantLoadFile {
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
write!(f, "can't load file: {}", display_rel_path(&self.path),)
}
}
impl core::error::Error for CantLoadFile {
fn source(&self) -> Option<&(dyn core::error::Error + 'static)> {
Some(&self.source)
}
}
impl From<CantLoadFile> for io::Error {
fn from(e: CantLoadFile) -> Self {
io::Error::new(io::ErrorKind::InvalidData, e)
}
}
type Task = (usize, PathBuf);
let seen_modules = Mutex::new(HashMap::<PathBuf, usize>::default());
let seen_embeds = Mutex::new(HashMap::<PathBuf, usize>::default());
let tasks = TaskQueue::<Task>::new(extra_threads + 1);
let ast = Mutex::new(Vec::<io::Result<Ast>>::new());
let embeds = Mutex::new(Vec::<Vec<u8>>::new());
let loader = |path: &str, from: &str, kind: FileKind, tmp: &mut _| {
let mut physiscal_path = resolve(path, from, tmp)?;
match kind {
FileKind::Module => {
let id = {
let mut seen = seen_modules.lock().unwrap();
let len = seen.len();
match seen.entry(physiscal_path) {
hash_map::Entry::Occupied(entry) => {
return Ok(*entry.get());
}
hash_map::Entry::Vacant(entry) => {
physiscal_path = entry.insert_entry(len as _).key().clone();
len
}
}
};
if !physiscal_path.exists() {
return Err(io::Error::new(
io::ErrorKind::NotFound,
format!("can't find file: {}", display_rel_path(&physiscal_path)),
));
}
tasks.push((id, physiscal_path));
Ok(id)
}
FileKind::Embed => {
let id = {
let mut seen = seen_embeds.lock().unwrap();
let len = seen.len();
match seen.entry(physiscal_path) {
hash_map::Entry::Occupied(entry) => {
return Ok(*entry.get());
}
hash_map::Entry::Vacant(entry) => {
physiscal_path = entry.insert_entry(len as _).key().clone();
len
}
}
};
let content = std::fs::read(&physiscal_path).map_err(|e| {
io::Error::new(
e.kind(),
format!(
"can't load embed file: {}: {e}",
display_rel_path(&physiscal_path)
),
)
})?;
let mut embeds = embeds.lock().unwrap();
if id as usize >= embeds.len() {
embeds.resize(id as usize + 1, Default::default());
}
embeds[id as usize] = content;
Ok(id)
}
}
};
let execute_task = |ctx: &mut _, (_, path): Task, tmp: &mut _| {
let path = path.to_str().ok_or_else(|| {
io::Error::new(
io::ErrorKind::InvalidData,
format!("path contains invalid characters: {}", display_rel_path(&path)),
)
})?;
Ok(Ast::new(path, std::fs::read_to_string(path)?, ctx, &mut |path, from, kind| {
loader(path, from, kind, tmp).map_err(|e| e.to_string())
}))
};
let thread = || {
let mut ctx = Ctx::default();
let mut tmp = PathBuf::new();
while let Some(task @ (indx, ..)) = tasks.pop() {
let res = execute_task(&mut ctx, task, &mut tmp);
let mut ast = ast.lock().unwrap();
let len = ast.len().max(indx + 1);
ast.resize_with(len, || Err(io::ErrorKind::InvalidData.into()));
ast[indx] = res;
}
ctx.errors.into_inner()
};
let path = Path::new(root).canonicalize().map_err(|e| {
io::Error::new(e.kind(), format!("can't canonicalize root file path ({root})"))
})?;
seen_modules.lock().unwrap().insert(path.clone(), 0);
tasks.push((0, path));
let errors = if extra_threads == 0 {
thread()
} else {
std::thread::scope(|s| {
(0..extra_threads + 1)
.map(|_| s.spawn(thread))
.collect::<Vec<_>>()
.into_iter()
.map(|t| t.join().unwrap())
.collect::<String>()
})
};
Ok(Loaded {
ast: ast.into_inner().unwrap().into_iter().collect::<io::Result<Vec<_>>>()?,
embeds: embeds.into_inner().unwrap(),
errors,
})
}
pub fn display_rel_path(path: &(impl AsRef<OsStr> + ?Sized)) -> std::path::Display {
static CWD: std::sync::LazyLock<PathBuf> =
std::sync::LazyLock::new(|| std::env::current_dir().unwrap_or_default());
std::path::Path::new(path).strip_prefix(&*CWD).unwrap_or(std::path::Path::new(path)).display()
}

141
lang/src/fuzz.rs Normal file

@ -0,0 +1,141 @@
use {
crate::{
lexer::TokenKind,
parser,
son::{hbvm::HbvmBackend, Codegen, CodegenCtx},
ty::Module,
},
alloc::string::String,
core::{fmt::Write, hash::BuildHasher, ops::Range},
};
#[derive(Default)]
struct Rand(pub u64);
impl Rand {
pub fn next(&mut self) -> u64 {
self.0 = crate::FnvBuildHasher::default().hash_one(self.0);
self.0
}
pub fn range(&mut self, min: u64, max: u64) -> u64 {
self.next() % (max - min) + min
}
fn bool(&mut self) -> bool {
self.next() % 2 == 0
}
}
#[derive(Default)]
struct FuncGen {
rand: Rand,
buf: String,
vars: u64,
}
impl FuncGen {
fn gen(&mut self, seed: u64) -> &str {
self.rand = Rand(seed);
self.buf.clear();
self.buf.push_str("main := fn(): void ");
self.block().unwrap();
&self.buf
}
fn block(&mut self) -> core::fmt::Result {
let prev_vars = self.vars;
self.buf.push('{');
for _ in 0..self.rand.range(1, 10) {
self.stmt()?;
}
self.buf.push('}');
self.vars = prev_vars;
Ok(())
}
fn stmt(&mut self) -> core::fmt::Result {
match self.rand.range(0, 100) {
0..4 => _ = self.block(),
4..10 => {
write!(self.buf, "var{} := ", self.vars)?;
self.expr()?;
self.vars += 1;
}
10..20 if self.vars != 0 => {
write!(self.buf, "var{} = ", self.rand.range(0, self.vars))?;
self.expr()?;
}
20..23 => {
self.buf.push_str("if ");
self.expr()?;
self.block()?;
if self.rand.bool() {
self.buf.push_str(" else ");
self.block()?;
}
}
_ => {
self.buf.push_str("return ");
self.expr()?;
}
}
self.buf.push(';');
Ok(())
}
fn expr(&mut self) -> core::fmt::Result {
match self.rand.range(0, 100) {
0..80 => {
write!(self.buf, "{}", self.rand.next())
}
80..90 if self.vars != 0 => {
write!(self.buf, "var{}", self.rand.range(0, self.vars))
}
80..100 => {
self.expr()?;
let ops = [
TokenKind::Add,
TokenKind::Sub,
TokenKind::Mul,
TokenKind::Div,
TokenKind::Shl,
TokenKind::Eq,
TokenKind::Ne,
TokenKind::Lt,
TokenKind::Gt,
TokenKind::Le,
TokenKind::Ge,
TokenKind::Band,
TokenKind::Bor,
TokenKind::Xor,
TokenKind::Mod,
TokenKind::Shr,
];
let op = ops[self.rand.range(0, ops.len() as u64) as usize];
write!(self.buf, " {op} ")?;
self.expr()
}
_ => unreachable!(),
}
}
}
pub fn fuzz(seed_range: Range<u64>) {
let mut gen = FuncGen::default();
let mut ctx = CodegenCtx::default();
for i in seed_range {
ctx.clear();
let src = gen.gen(i);
let parsed = parser::Ast::new("fuzz", src, &mut ctx.parser, &mut parser::no_loader);
assert!(ctx.parser.errors.get_mut().is_empty());
let mut backend = HbvmBackend::default();
let mut cdg = Codegen::new(&mut backend, core::slice::from_ref(&parsed), &mut ctx);
cdg.generate(Module::MAIN);
}
}

3
lang/src/fuzz_main.rs Normal file

@ -0,0 +1,3 @@
fn main() {
hblang::fuzz::fuzz(0..1000000);
}

575
lang/src/lexer.rs Normal file

@ -0,0 +1,575 @@
const fn ascii_mask(chars: &[u8]) -> u128 {
let mut eq = 0;
let mut i = 0;
while i < chars.len() {
let b = chars[i];
eq |= 1 << b;
i += 1;
}
eq
}
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub struct Token {
pub kind: TokenKind,
pub start: u32,
pub end: u32,
}
impl Token {
pub fn range(&self) -> core::ops::Range<usize> {
self.start as usize..self.end as usize
}
}
macro_rules! gen_token_kind {
($(
#[$atts:meta])*
$vis:vis enum $name:ident {
#[patterns] $(
$pattern:ident,
)*
#[keywords] $(
$keyword:ident = $keyword_lit:literal,
)*
#[punkt] $(
$punkt:ident = $punkt_lit:literal,
)*
#[ops] $(
#[$prec:ident] $(
$op:ident = $op_lit:literal $(=> $assign:ident)?,
)*
)*
}
) => {
impl core::fmt::Display for $name {
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
f.write_str(self.name())
}
}
impl $name {
pub const OPS: &[Self] = &[$($(Self::$op),*),*];
pub fn name(&self) -> &str {
let sf = unsafe { &*(self as *const _ as *const u8) } ;
match *self {
$( Self::$pattern => concat!('<', stringify!($pattern), '>'), )*
$( Self::$keyword => stringify!($keyword_lit), )*
$( Self::$punkt => stringify!($punkt_lit), )*
$($( Self::$op => $op_lit,
$(Self::$assign => concat!($op_lit, "="),)?)*)*
_ => unsafe { core::str::from_utf8_unchecked(core::slice::from_ref(&sf)) },
}
}
#[inline(always)]
pub fn precedence(&self) -> Option<u8> {
Some(match self {
$($(Self::$op => ${ignore($prec)} ${index(1)},
$(Self::$assign => 0,)?)*)*
_ => return None,
} + 1)
}
fn from_ident(ident: &[u8]) -> Self {
match ident {
$($keyword_lit => Self::$keyword,)*
_ => Self::Ident,
}
}
}
};
}
#[derive(PartialEq, Eq, Clone, Copy, Hash, PartialOrd, Ord)]
#[repr(u8)]
pub enum TokenKind {
Not = b'!',
DQuote = b'"',
Pound = b'#',
CtIdent = b'$',
Mod = b'%',
Band = b'&',
Quote = b'\'',
LParen = b'(',
RParen = b')',
Mul = b'*',
Add = b'+',
Comma = b',',
Sub = b'-',
Dot = b'.',
Div = b'/',
// Unused = 2-6
Shl = b'<' - 5,
// Unused = 8
Shr = b'>' - 5,
Colon = b':',
Semi = b';',
Lt = b'<',
Assign = b'=',
Gt = b'>',
Que = b'?',
Directive = b'@',
Comment,
Ident,
Number,
Float,
Eof,
Ct,
Ctor,
Tupl,
TArrow,
Or,
And,
// Unused = R-Z
LBrack = b'[',
BSlash = b'\\',
RBrack = b']',
Xor = b'^',
Under = b'_',
Tick = b'`',
Slf,
Return,
If,
Match,
Else,
Loop,
Break,
Continue,
Fn,
Struct,
Packed,
Enum,
True,
False,
Null,
Idk,
Die,
Defer,
// Unused = a-z
LBrace = b'{',
Bor = b'|',
RBrace = b'}',
Tilde = b'~',
Decl = b':' + 128,
Eq = b'=' + 128,
Ne = b'!' + 128,
Le = b'<' + 128,
Ge = b'>' + 128,
BorAss = b'|' + 128,
AddAss = b'+' + 128,
SubAss = b'-' + 128,
MulAss = b'*' + 128,
DivAss = b'/' + 128,
ModAss = b'%' + 128,
XorAss = b'^' + 128,
BandAss = b'&' + 128,
ShrAss = b'>' - 5 + 128,
ShlAss = b'<' - 5 + 128,
}
impl core::fmt::Debug for TokenKind {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
core::fmt::Display::fmt(self, f)
}
}
impl TokenKind {
pub fn ass_op(self) -> Option<Self> {
let id = (self as u8).saturating_sub(128);
if ascii_mask(b"|+-*/%^&79") & (1u128 << id) == 0 {
return None;
}
Some(unsafe { core::mem::transmute::<u8, Self>(id) })
}
pub fn is_comutative(self) -> bool {
use TokenKind as S;
matches!(self, S::Eq | S::Ne | S::Bor | S::Xor | S::Band | S::Add | S::Mul)
}
pub fn is_compatison(self) -> bool {
matches!(self, Self::Lt | Self::Gt | Self::Ge | Self::Le | Self::Ne | Self::Eq)
}
pub fn is_supported_float_op(self) -> bool {
matches!(
self,
Self::Add
| Self::Sub
| Self::Mul
| Self::Div
| Self::Eq
| Self::Ne
| Self::Le
| Self::Ge
| Self::Lt
| Self::Gt
)
}
pub fn apply_binop(self, a: i64, b: i64, float: bool) -> i64 {
if float {
debug_assert!(self.is_supported_float_op());
let [a, b] = [f64::from_bits(a as _), f64::from_bits(b as _)];
let res = match self {
Self::Add => a + b,
Self::Sub => a - b,
Self::Mul => a * b,
Self::Div => a / b,
Self::Eq => return (a == b) as i64,
Self::Ne => return (a != b) as i64,
Self::Lt => return (a < b) as i64,
Self::Gt => return (a > b) as i64,
Self::Le => return (a >= b) as i64,
Self::Ge => return (a <= b) as i64,
_ => todo!("floating point op: {self}"),
};
return res.to_bits() as _;
}
match self {
Self::Add => a.wrapping_add(b),
Self::Sub => a.wrapping_sub(b),
Self::Mul => a.wrapping_mul(b),
Self::Div if b == 0 => 0,
Self::Div => a.wrapping_div(b),
Self::Shl => a.wrapping_shl(b as _),
Self::Eq => (a == b) as i64,
Self::Ne => (a != b) as i64,
Self::Lt => (a < b) as i64,
Self::Gt => (a > b) as i64,
Self::Le => (a >= b) as i64,
Self::Ge => (a <= b) as i64,
Self::Band => a & b,
Self::Bor => a | b,
Self::Xor => a ^ b,
Self::Mod if b == 0 => 0,
Self::Mod => a.wrapping_rem(b),
Self::Shr => a.wrapping_shr(b as _),
s => todo!("{s}"),
}
}
pub fn is_homogenous(&self) -> bool {
self.precedence() != Self::Eq.precedence()
&& self.precedence() != Self::Gt.precedence()
&& self.precedence() != Self::Eof.precedence()
}
pub fn apply_unop(&self, value: i64, float: bool) -> i64 {
match self {
Self::Sub if float => (-f64::from_bits(value as _)).to_bits() as _,
Self::Sub => value.wrapping_neg(),
Self::Not => (value == 0) as _,
Self::Float if float => value,
Self::Float => (value as f64).to_bits() as _,
Self::Number if float => f64::from_bits(value as _) as _,
Self::Number => value,
s => todo!("{s}"),
}
}
pub fn closing(&self) -> Option<TokenKind> {
Some(match self {
Self::Ctor => Self::RBrace,
Self::Tupl => Self::RParen,
Self::LParen => Self::RParen,
Self::LBrack => Self::RBrack,
Self::LBrace => Self::RBrace,
_ => return None,
})
}
}
gen_token_kind! {
pub enum TokenKind {
#[patterns]
CtIdent,
Ident,
Number,
Float,
Eof,
Directive,
#[keywords]
Slf = b"Self",
Return = b"return",
If = b"if",
Match = b"match",
Else = b"else",
Loop = b"loop",
Break = b"break",
Continue = b"continue",
Fn = b"fn",
Struct = b"struct",
Packed = b"packed",
Enum = b"enum",
True = b"true",
False = b"false",
Null = b"null",
Idk = b"idk",
Die = b"die",
Defer = b"defer",
Under = b"_",
#[punkt]
Ctor = ".{",
Tupl = ".(",
TArrow = "=>",
// #define OP: each `#[prec]` delimits a level of precedence, from lowest to highest
#[ops]
#[prec]
// this also includes all `<op>=` tokens
Decl = ":=",
Assign = "=",
#[prec]
Or = "||",
#[prec]
And = "&&",
#[prec]
Bor = "|" => BorAss,
#[prec]
Xor = "^" => XorAss,
#[prec]
Band = "&" => BandAss,
#[prec]
Eq = "==",
Ne = "!=",
#[prec]
Le = "<=",
Ge = ">=",
Lt = "<",
Gt = ">",
#[prec]
Shl = "<<" => ShlAss,
Shr = ">>" => ShrAss,
#[prec]
Add = "+" => AddAss,
Sub = "-" => SubAss,
#[prec]
Mul = "*" => MulAss,
Div = "/" => DivAss,
Mod = "%" => ModAss,
}
}
pub struct Lexer<'a> {
pos: u32,
source: &'a [u8],
}
impl<'a> Lexer<'a> {
pub fn new(input: &'a str) -> Self {
Self::restore(input, 0)
}
pub fn uses(input: &'a str) -> impl Iterator<Item = &'a str> {
let mut s = Self::new(input);
core::iter::from_fn(move || loop {
let t = s.eat();
if t.kind == TokenKind::Eof {
return None;
}
if t.kind == TokenKind::Directive
&& s.slice(t.range()) == "use"
&& s.eat().kind == TokenKind::LParen
{
let t = s.eat();
if t.kind == TokenKind::DQuote {
return Some(&s.slice(t.range())[1..t.range().len() - 1]);
}
}
})
}
pub fn restore(input: &'a str, pos: u32) -> Self {
Self { pos, source: input.as_bytes() }
}
pub fn source(&self) -> &'a str {
unsafe { core::str::from_utf8_unchecked(self.source) }
}
pub fn slice(&self, tok: core::ops::Range<usize>) -> &'a str {
unsafe { core::str::from_utf8_unchecked(&self.source[tok]) }
}
pub fn taste(&self) -> Token {
Lexer { pos: self.pos, source: self.source }.eat()
}
fn peek(&self) -> Option<u8> {
if core::intrinsics::unlikely(self.pos >= self.source.len() as u32) {
None
} else {
Some(unsafe { *self.source.get_unchecked(self.pos as usize) })
}
}
fn advance(&mut self) -> Option<u8> {
let c = self.peek()?;
self.pos += 1;
Some(c)
}
pub fn last(&mut self) -> Token {
let mut token = self.eat();
loop {
let next = self.eat();
if next.kind == TokenKind::Eof {
break;
}
token = next;
}
token
}
pub fn eat(&mut self) -> Token {
use TokenKind as T;
loop {
let mut start = self.pos;
let Some(c) = self.advance() else {
return Token { kind: T::Eof, start, end: self.pos };
};
let advance_ident = |s: &mut Self| {
while let Some(b'a'..=b'z' | b'A'..=b'Z' | b'0'..=b'9' | b'_' | 127..) = s.peek() {
s.advance();
}
};
let identity = |s: u8| unsafe { core::mem::transmute::<u8, T>(s) };
let kind = match c {
..=b' ' => continue,
b'0' if self.advance_if(b'x') => {
while let Some(b'0'..=b'9' | b'A'..=b'F' | b'a'..=b'f') = self.peek() {
self.advance();
}
T::Number
}
b'0' if self.advance_if(b'b') => {
while let Some(b'0' | b'1') = self.peek() {
self.advance();
}
T::Number
}
b'0' if self.advance_if(b'o') => {
while let Some(b'0'..=b'7') = self.peek() {
self.advance();
}
T::Number
}
b'0'..=b'9' => {
while let Some(b'0'..=b'9') = self.peek() {
self.advance();
}
if self.advance_if(b'.') {
while let Some(b'0'..=b'9') = self.peek() {
self.advance();
}
T::Float
} else {
T::Number
}
}
b'a'..=b'z' | b'A'..=b'Z' | b'_' | 127.. => {
advance_ident(self);
let ident = &self.source[start as usize..self.pos as usize];
T::from_ident(ident)
}
b'"' | b'\'' => loop {
match self.advance() {
Some(b'\\') => _ = self.advance(),
Some(nc) if nc == c => break identity(c),
Some(_) => {}
None => break T::Eof,
}
},
b'/' if self.advance_if(b'/') => {
while let Some(l) = self.peek()
&& l != b'\n'
{
self.pos += 1;
}
let end = self.source[..self.pos as usize]
.iter()
.rposition(|&b| !b.is_ascii_whitespace())
.map_or(self.pos, |i| i as u32 + 1);
return Token { kind: T::Comment, start, end };
}
b'/' if self.advance_if(b'*') => {
let mut depth = 1;
while let Some(l) = self.advance() {
match l {
b'/' if self.advance_if(b'*') => depth += 1,
b'*' if self.advance_if(b'/') => match depth {
1 => break,
_ => depth -= 1,
},
_ => {}
}
}
T::Comment
}
b'.' if self.advance_if(b'{') => T::Ctor,
b'.' if self.advance_if(b'(') => T::Tupl,
b'=' if self.advance_if(b'>') => T::TArrow,
b'&' if self.advance_if(b'&') => T::And,
b'|' if self.advance_if(b'|') => T::Or,
b'$' if self.advance_if(b':') => T::Ct,
b'@' | b'$' => {
start += 1;
advance_ident(self);
identity(c)
}
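// doubled '<'/'>' map to the shift tokens; adding 128 selects the `<<=`/`>>=`
// assignment forms, mirroring the generic `<op>=` offset in the arm below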
b'<' | b'>' if self.advance_if(c) => {
identity(c - 5 + 128 * self.advance_if(b'=') as u8)
}
b':' | b'=' | b'!' | b'<' | b'>' | b'|' | b'+' | b'-' | b'*' | b'/' | b'%'
| b'^' | b'&'
if self.advance_if(b'=') =>
{
identity(c + 128)
}
_ => identity(c),
};
return Token { kind, start, end: self.pos };
}
}
fn advance_if(&mut self, arg: u8) -> bool {
if self.peek() == Some(arg) {
self.advance();
true
} else {
false
}
}
}
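// Converts a byte offset into a 1-based (line, column) pair.
// A quick check of the intent: `line_col(b"ab\ncd", 4)` points at `d`, i.e. `(2, 2)`.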
pub fn line_col(bytes: &[u8], pos: u32) -> (usize, usize) {
bytes[..pos as usize]
.split(|&b| b == b'\n')
.map(<[u8]>::len)
.enumerate()
.last()
.map(|(line, col)| (line + 1, col + 1))
.unwrap_or((1, 1))
}

1576
lang/src/lib.rs Normal file

File diff suppressed because it is too large

31
lang/src/main.rs Normal file
@@ -0,0 +1,31 @@
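// CLI entry point: compiles the first non-flag argument (defaulting to `main.hb`),
// writing warnings to stderr and the compiler output to stdout (stderr on failure).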
#[cfg(feature = "std")]
fn main() {
use std::io::Write;
fn run(out: &mut Vec<u8>, warnings: &mut String) -> std::io::Result<()> {
let args = std::env::args().collect::<Vec<_>>();
let args = args.iter().map(String::as_str).collect::<Vec<_>>();
let opts = hblang::Options::from_args(&args, out)?;
let file = args.iter().filter(|a| !a.starts_with('-')).nth(1).copied().unwrap_or("main.hb");
hblang::run_compiler(file, opts, out, warnings)
}
log::set_logger(&hblang::fs::Logger).unwrap();
log::set_max_level(log::LevelFilter::Error);
let mut out = Vec::new();
let mut warnings = String::new();
match run(&mut out, &mut warnings) {
Ok(_) => {
std::io::stderr().write_all(warnings.as_bytes()).unwrap();
std::io::stdout().write_all(&out).unwrap()
}
Err(_) => {
std::io::stderr().write_all(warnings.as_bytes()).unwrap();
std::io::stderr().write_all(&out).unwrap();
std::process::exit(1);
}
}
}

1672
lang/src/parser.rs Normal file

File diff suppressed because it is too large

5834
lang/src/son.rs Normal file

File diff suppressed because it is too large

1172
lang/src/son/hbvm.rs Normal file

File diff suppressed because it is too large

@@ -0,0 +1,805 @@
use {
super::{HbvmBackend, Nid, Nodes},
crate::{
parser,
reg::{self, Reg},
son::{debug_assert_matches, Kind, ARG_START, MEM, VOID},
ty::{self, Arg, Loc},
utils::BitSet,
PLoc, Sig, Types,
},
alloc::{borrow::ToOwned, vec::Vec},
core::{mem, ops::Range},
hbbytecode::{self as instrs},
};
impl HbvmBackend {
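// Emits machine code for one function body: linearizes the blocks, runs the bundle-based
// register allocator, lowers each node, and returns a register-save count together with
// whether the function only performs ECA calls (`tail`).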
pub(super) fn emit_body_code(
&mut self,
nodes: &Nodes,
sig: Sig,
tys: &Types,
files: &[parser::Ast],
) -> (usize, bool) {
let tail = Function::build(nodes, tys, &mut self.ralloc, sig);
nodes.basic_blocks();
let strip_load = |value| match nodes[value].kind {
Kind::Load { .. } if nodes[value].ty.loc(tys) == Loc::Stack => nodes[value].inputs[1],
_ => value,
};
let mut res = mem::take(&mut self.ralloc);
Regalloc::run(nodes, &mut res);
'_open_function: {
self.emit(instrs::addi64(reg::STACK_PTR, reg::STACK_PTR, 0));
self.emit(instrs::st(reg::RET_ADDR + tail as u8, reg::STACK_PTR, 0, 0));
}
if let Some(PLoc::Ref(..)) = tys.parama(sig.ret).0 {
res.node_to_reg[MEM as usize] = res.bundles.len() as u8 + 1;
res.bundles.push(Bundle::new(0));
}
let reg_offset = if tail { reg::RET + 12 } else { reg::RET_ADDR + 1 };
let bundle_count = res.bundles.len() + (reg_offset as usize);
res.node_to_reg.iter_mut().filter(|r| **r != 0).for_each(|r| {
if *r == u8::MAX {
*r = 0
} else {
*r += reg_offset - 1;
if tail && *r >= reg::RET_ADDR {
*r += 1;
}
}
});
debug_assert!(!res
.node_to_reg
.iter()
.any(|&a| a == reg::RET_ADDR || (reg::RET..reg_offset - 1).contains(&a)));
let atr = |allc: Nid| {
let allc = strip_load(allc);
debug_assert_eq!(
nodes[allc].lock_rc.get(),
0,
"{:?} {}",
nodes[allc],
ty::Display::new(tys, files, nodes[allc].ty)
);
res.node_to_reg[allc as usize]
};
let (retl, mut parama) = tys.parama(sig.ret);
let mut typs = sig.args.args();
let mut args = nodes[VOID].outputs[ARG_START..].iter();
while let Some(aty) = typs.next(tys) {
let Arg::Value(ty) = aty else { continue };
let Some(loc) = parama.next(ty, tys) else { continue };
let &arg = args.next().unwrap();
let (rg, size) = match loc {
PLoc::WideReg(rg, size) => (rg, size),
PLoc::Reg(rg, size) if ty.loc(tys) == Loc::Stack => (rg, size),
PLoc::Reg(r, ..) | PLoc::Ref(r, ..) => {
self.emit_cp(atr(arg), r);
continue;
}
};
self.emit(instrs::st(rg, reg::STACK_PTR, self.offsets[arg as usize] as _, size));
if nodes.is_unlocked(arg) {
self.emit(instrs::addi64(rg, reg::STACK_PTR, self.offsets[arg as usize] as _));
}
self.emit_cp(atr(arg), rg);
}
let mut alloc_buf = vec![];
for (i, block) in res.blocks.iter().enumerate() {
self.offsets[block.entry as usize] = self.code.len() as _;
for &nid in &res.instrs[block.range()] {
if nid == VOID {
continue;
}
let node = &nodes[nid];
alloc_buf.clear();
let atr = |allc: Nid| {
let allc = strip_load(allc);
debug_assert_eq!(
nodes[allc].lock_rc.get(),
0,
"{:?} {}",
nodes[allc],
ty::Display::new(tys, files, nodes[allc].ty)
);
#[cfg(debug_assertions)]
debug_assert!(
res.marked.contains(&(allc, nid))
|| nid == allc
|| nodes.is_hard_zero(allc)
|| allc == MEM
|| matches!(node.kind, Kind::Loop | Kind::Region),
"{nid} {:?}\n{allc} {:?}",
nodes[nid],
nodes[allc]
);
res.node_to_reg[allc as usize]
};
let mut is_next_block = false;
match node.kind {
Kind::If => {
let &[_, cnd] = node.inputs.as_slice() else { unreachable!() };
if nodes.cond_op(cnd).is_some() {
let &[_, lh, rh] = nodes[cnd].inputs.as_slice() else { unreachable!() };
alloc_buf.extend([atr(lh), atr(rh)]);
} else {
alloc_buf.push(atr(cnd));
}
}
Kind::Loop | Kind::Region => {
let index = node
.inputs
.iter()
.position(|&n| block.entry == nodes.idom_of(n))
.unwrap()
+ 1;
let mut moves = vec![];
for &out in node.outputs.iter() {
if nodes[out].is_data_phi() {
let src = nodes[out].inputs[index];
if atr(out) != atr(src) {
moves.push([atr(out), atr(src), 0]);
}
}
}
debug_assert_eq!(moves.len(), {
moves.sort_unstable();
moves.dedup();
moves.len()
});
moves.sort_unstable_by(|[aa, ab, _], [ba, bb, _]| {
if aa == bb && ab == ba {
core::cmp::Ordering::Equal
} else if aa == bb {
core::cmp::Ordering::Greater
} else {
core::cmp::Ordering::Less
}
});
moves.dedup_by(|[aa, ab, _], [ba, bb, kind]| {
if aa == bb && ab == ba {
*kind = 1;
true
} else {
false
}
});
for [dst, src, kind] in moves {
if kind == 0 {
self.emit(instrs::cp(dst, src));
} else {
self.emit(instrs::swa(dst, src));
}
}
is_next_block = res.backrefs[nid as usize] as usize == i + 1;
}
Kind::Return { .. } => {
let &[_, ret, ..] = node.inputs.as_slice() else { unreachable!() };
match retl {
Some(PLoc::Reg(r, _)) if sig.ret.loc(tys) == Loc::Reg => {
alloc_buf.push(atr(ret));
self.emit(instrs::cp(r, atr(ret)));
}
Some(PLoc::Ref(..)) => alloc_buf.extend([atr(ret), atr(MEM)]),
Some(_) => alloc_buf.push(atr(ret)),
None => {}
}
}
Kind::Die => {}
Kind::CInt { .. } => alloc_buf.push(atr(nid)),
Kind::UnOp { .. } => alloc_buf.extend([atr(nid), atr(node.inputs[1])]),
Kind::BinOp { op } => {
let &[.., lhs, rhs] = node.inputs.as_slice() else { unreachable!() };
if let Kind::CInt { .. } = nodes[rhs].kind
&& nodes.is_locked(rhs)
&& op.imm_binop(node.ty).is_some()
{
alloc_buf.extend([atr(nid), atr(lhs)]);
} else {
alloc_buf.extend([atr(nid), atr(lhs), atr(rhs)]);
}
}
Kind::Call { args, .. } => {
let (ret, mut parama) = tys.parama(node.ty);
if ret.is_some() {
alloc_buf.push(atr(nid));
}
let mut args = args.args();
let mut allocs = node.inputs[1..].iter();
while let Some(arg) = args.next(tys) {
let Arg::Value(ty) = arg else { continue };
let Some(loc) = parama.next(ty, tys) else { continue };
let arg = *allocs.next().unwrap();
alloc_buf.push(atr(arg));
match loc {
PLoc::Reg(..) if ty.loc(tys) == Loc::Stack => {}
PLoc::WideReg(..) => alloc_buf.push(0),
PLoc::Reg(r, ..) | PLoc::Ref(r, ..) => {
self.emit(instrs::cp(r, atr(arg)))
}
};
}
if node.ty.loc(tys) == Loc::Stack {
alloc_buf.push(atr(*node.inputs.last().unwrap()));
}
if let Some(PLoc::Ref(r, ..)) = ret {
self.emit(instrs::cp(r, *alloc_buf.last().unwrap()))
}
}
Kind::Stck | Kind::Global { .. } => alloc_buf.push(atr(nid)),
Kind::Load => {
let (region, _) = nodes.strip_offset(node.inputs[1], node.ty, tys);
if node.ty.loc(tys) != Loc::Stack {
alloc_buf.push(atr(nid));
match nodes[region].kind {
Kind::Stck => {}
_ => alloc_buf.push(atr(region)),
}
}
}
Kind::Stre if node.inputs[1] == VOID => {}
Kind::Stre => {
let (region, _) = nodes.strip_offset(node.inputs[2], node.ty, tys);
match nodes[region].kind {
Kind::Stck if node.ty.loc(tys) == Loc::Reg => {
alloc_buf.push(atr(node.inputs[1]))
}
_ => alloc_buf.extend([atr(region), atr(node.inputs[1])]),
}
}
Kind::Mem => {
self.emit(instrs::cp(atr(MEM), reg::RET));
continue;
}
Kind::Arg => {
continue;
}
_ => {}
}
self.emit_instr(super::InstrCtx {
nid,
sig,
is_next_block,
is_last_block: i == res.blocks.len() - 1,
retl,
allocs: &alloc_buf,
nodes,
tys,
files,
});
if let Kind::Call { .. } = node.kind {
let (ret, ..) = tys.parama(node.ty);
match ret {
Some(PLoc::WideReg(..)) => {}
Some(PLoc::Reg(..)) if node.ty.loc(tys) == Loc::Stack => {}
Some(PLoc::Reg(r, ..)) => self.emit_cp(atr(nid), r),
None | Some(PLoc::Ref(..)) => {}
}
}
}
}
self.ralloc = res;
debug_assert!(bundle_count < reg::STACK_PTR as usize, "TODO: spill memory");
debug_assert_eq!(
self.ralloc
.node_to_reg
.iter()
.filter(|&&r| r
> (bundle_count as u8
+ (tail && bundle_count > (reg::RET_ADDR) as usize) as u8))
.copied()
.collect::<Vec<_>>(),
vec![],
"{bundle_count}"
);
(
if tail {
bundle_count.saturating_sub(reg::RET_ADDR as _)
} else {
self.ralloc.bundles.len()
},
tail,
)
}
fn emit_cp(&mut self, dst: Reg, src: Reg) {
if dst != 0 {
self.emit(instrs::cp(dst, src));
}
}
}
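// Walks the scheduled node graph and linearizes it into `Res::blocks`/`Res::instrs`;
// `tail` remains true only while every call encountered is an ECA.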
struct Function<'a> {
sig: Sig,
tail: bool,
nodes: &'a Nodes,
tys: &'a Types,
func: &'a mut Res,
}
impl core::fmt::Debug for Function<'_> {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
for block in &self.func.blocks {
writeln!(f, "{:?}", self.nodes[block.entry].kind)?;
for &instr in &self.func.instrs[block.range()] {
writeln!(f, "{:?}", self.nodes[instr].kind)?;
}
}
Ok(())
}
}
impl<'a> Function<'a> {
fn build(nodes: &'a Nodes, tys: &'a Types, func: &'a mut Res, sig: Sig) -> bool {
func.blocks.clear();
func.instrs.clear();
func.backrefs.resize(nodes.values.len(), u16::MAX);
func.visited.clear(nodes.values.len());
let mut s = Self { tail: true, nodes, tys, sig, func };
s.emit_node(VOID);
debug_assert!(s.func.blocks.array_chunks().all(|[a, b]| a.end == b.start));
log::info!("{s:?}");
s.tail
}
fn add_block(&mut self, entry: Nid) {
self.func.blocks.push(Block {
start: self.func.instrs.len() as _,
end: self.func.instrs.len() as _,
entry,
});
self.func.backrefs[entry as usize] = self.func.blocks.len() as u16 - 1;
}
fn close_block(&mut self, exit: Nid) {
if !matches!(self.nodes[exit].kind, Kind::Loop | Kind::Region) {
self.add_instr(exit);
} else {
self.func.instrs.push(exit);
}
let prev = self.func.blocks.last_mut().unwrap();
prev.end = self.func.instrs.len() as _;
}
fn add_instr(&mut self, nid: Nid) {
debug_assert_ne!(self.nodes[nid].kind, Kind::Loop);
self.func.backrefs[nid as usize] = self.func.instrs.len() as u16;
self.func.instrs.push(nid);
}
fn emit_node(&mut self, nid: Nid) {
if matches!(self.nodes[nid].kind, Kind::Region | Kind::Loop) {
match (self.nodes[nid].kind, self.func.visited.set(nid)) {
(Kind::Loop, false) | (Kind::Region, true) => {
self.close_block(nid);
return;
}
_ => {}
}
} else if !self.func.visited.set(nid) {
return;
}
if self.nodes.is_never_used(nid, self.tys) {
self.nodes.lock(nid);
return;
}
let mut node = self.nodes[nid].clone();
match node.kind {
Kind::Start => {
debug_assert_matches!(self.nodes[node.outputs[0]].kind, Kind::Entry);
self.add_block(VOID);
self.emit_node(node.outputs[0])
}
Kind::If => {
let &[_, cnd] = node.inputs.as_slice() else { unreachable!() };
let &[mut then, mut else_] = node.outputs.as_slice() else { unreachable!() };
if let Some((_, swapped)) = self.nodes.cond_op(cnd) {
if swapped {
mem::swap(&mut then, &mut else_);
}
} else {
mem::swap(&mut then, &mut else_);
}
self.close_block(nid);
self.emit_node(then);
self.emit_node(else_);
}
Kind::Region | Kind::Loop => {
self.close_block(nid);
self.add_block(nid);
self.nodes.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Return { .. } | Kind::Die => {
self.close_block(nid);
self.emit_node(node.outputs[0]);
}
Kind::Entry => {
let (ret, mut parama) = self.tys.parama(self.sig.ret);
if let Some(PLoc::Ref(..)) = ret {
self.add_instr(MEM);
}
let mut typs = self.sig.args.args();
#[expect(clippy::unnecessary_to_owned)]
let mut args = self.nodes[VOID].outputs[ARG_START..].to_owned().into_iter();
while let Some(ty) = typs.next_value(self.tys) {
let arg = args.next().unwrap();
debug_assert_eq!(self.nodes[arg].kind, Kind::Arg);
match parama.next(ty, self.tys) {
None => {}
Some(_) => self.add_instr(arg),
}
}
self.nodes.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Then | Kind::Else => {
self.add_block(nid);
self.nodes.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Call { func, .. } => {
self.tail &= func == ty::Func::ECA;
self.add_instr(nid);
self.nodes.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
if self.nodes[o].inputs[0] == nid
|| (matches!(self.nodes[o].kind, Kind::Loop | Kind::Region)
&& self.nodes[o].inputs[1] == nid)
{
self.emit_node(o);
}
}
}
Kind::CInt { value: 0 } if self.nodes.is_hard_zero(nid) => {}
Kind::CInt { .. }
| Kind::BinOp { .. }
| Kind::UnOp { .. }
| Kind::Global { .. }
| Kind::Load { .. }
| Kind::Stre
| Kind::Stck => self.add_instr(nid),
Kind::End | Kind::Phi | Kind::Arg | Kind::Mem | Kind::Loops | Kind::Join => {}
Kind::Assert { .. } => unreachable!(),
}
}
}
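// Graph queries (dominators, phi inputs, data uses) consumed by the scheduler and
// the register allocator below.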
impl Nodes {
fn vreg_count(&self) -> usize {
self.values.len()
}
fn use_block_of(&self, inst: Nid, uinst: Nid) -> Nid {
let mut block = self.use_block(inst, uinst, None);
while !self[block].kind.starts_basic_block() {
block = self.idom(block, None);
}
block
}
fn phi_inputs_of(&self, nid: Nid) -> impl Iterator<Item = [Nid; 3]> + use<'_> {
match self[nid].kind {
Kind::Region | Kind::Loop => Some({
self[nid]
.outputs
.as_slice()
.iter()
.filter(|&&n| self[n].is_data_phi())
.map(|&n| [n, self[n].inputs[1], self[n].inputs[2]])
})
.into_iter()
.flatten(),
_ => None.into_iter().flatten(),
}
}
fn idom_of(&self, mut nid: Nid) -> Nid {
while !self[nid].kind.starts_basic_block() {
nid = self.idom(nid, None);
}
nid
}
fn uses_of(&self, nid: Nid) -> impl Iterator<Item = (Nid, Nid)> + use<'_> {
if self[nid].kind.is_cfg() && !matches!(self[nid].kind, Kind::Call { .. }) {
return None.into_iter().flatten();
}
Some(
self[nid]
.outputs
.iter()
.filter(move |&&n| self.is_data_dep(nid, n))
.map(move |n| self.this_or_delegates(nid, n))
.flat_map(|(p, ls)| ls.iter().map(move |l| (p, l)))
.filter(|&(o, &n)| self.is_data_dep(o, n))
.map(|(p, &n)| (self.use_block_of(p, n), n))
.inspect(|&(_, n)| debug_assert_eq!(self[n].lock_rc.get(), 0)),
)
.into_iter()
.flatten()
}
}
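// Register allocation: each value gets a live-range `Bundle`, and bundles that never
// overlap are merged so their values can share one physical register.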
struct Regalloc<'a> {
nodes: &'a Nodes,
res: &'a mut Res,
}
impl<'a> Regalloc<'a> {
fn instr_of(&self, nid: Nid) -> Option<Nid> {
if self.nodes[nid].kind == Kind::Phi || self.nodes.is_locked(nid) {
return None;
}
debug_assert_ne!(self.res.backrefs[nid as usize], Nid::MAX, "{:?}", self.nodes[nid]);
Some(self.res.backrefs[nid as usize])
}
fn block_of(&self, nid: Nid) -> Nid {
debug_assert!(self.nodes[nid].kind.starts_basic_block());
self.res.backrefs[nid as usize]
}
fn run(ctx: &'a Nodes, res: &'a mut Res) {
Self { nodes: ctx, res }.run_low();
}
fn run_low(&mut self) {
self.res.bundles.clear();
self.res.node_to_reg.clear();
#[cfg(debug_assertions)]
self.res.marked.clear();
self.res.node_to_reg.resize(self.nodes.vreg_count(), 0);
debug_assert!(self.res.dfs_buf.is_empty());
let mut bundle = Bundle::new(self.res.instrs.len());
self.res.visited.clear(self.nodes.values.len());
for i in (0..self.res.blocks.len()).rev() {
for [a, rest @ ..] in self.nodes.phi_inputs_of(self.res.blocks[i].entry) {
if self.res.visited.set(a) {
self.append_bundle(a, &mut bundle, None);
}
for r in rest {
if !self.res.visited.set(r) {
continue;
}
self.append_bundle(
r,
&mut bundle,
Some(self.res.node_to_reg[a as usize] as usize - 1),
);
}
}
}
let instrs = mem::take(&mut self.res.instrs);
for &inst in &instrs {
if self.nodes[inst].has_no_value() || self.res.visited.get(inst) || inst == 0 {
continue;
}
self.append_bundle(inst, &mut bundle, None);
}
self.res.instrs = instrs;
}
fn collect_bundle(&mut self, inst: Nid, into: &mut Bundle) {
let dom = self.nodes.idom_of(inst);
self.res.dfs_seem.clear(self.nodes.values.len());
for (cursor, uinst) in self.nodes.uses_of(inst) {
if !self.res.dfs_seem.set(uinst) {
continue;
}
#[cfg(debug_assertions)]
debug_assert!(self.res.marked.insert((inst, uinst)));
self.reverse_cfg_dfs(cursor, dom, |s, n, b| {
let mut range = b.range();
debug_assert!(range.start < range.end);
range.start = range.start.max(s.instr_of(inst).map_or(0, |n| n + 1) as usize);
debug_assert!(
range.start < range.end,
"{:?} {:?} {n} {inst}",
range,
self.nodes[inst]
);
let new = range.end.min(
s.instr_of(uinst)
.filter(|_| {
n == cursor
&& self.nodes.loop_depth(dom, None)
== self.nodes.loop_depth(cursor, None)
})
.map_or(Nid::MAX, |n| n + 1) as usize,
);
range.end = new;
debug_assert!(range.start < range.end, "{:?} {inst} {uinst}", range);
into.add(range);
});
}
}
fn append_bundle(&mut self, inst: Nid, tmp: &mut Bundle, prefered: Option<usize>) {
self.collect_bundle(inst, tmp);
if tmp.is_empty() {
self.res.node_to_reg[inst as usize] = u8::MAX;
return;
}
if let Some(prefered) = prefered
&& !self.res.bundles[prefered].overlaps(tmp)
{
self.res.bundles[prefered].merge(tmp);
tmp.clear();
self.res.node_to_reg[inst as usize] = prefered as Reg + 1;
return;
}
match self.res.bundles.iter_mut().enumerate().find(|(_, b)| !b.overlaps(tmp)) {
Some((i, other)) => {
other.merge(tmp);
tmp.clear();
self.res.node_to_reg[inst as usize] = i as Reg + 1;
}
None => {
self.res.bundles.push(tmp.take());
self.res.node_to_reg[inst as usize] = self.res.bundles.len() as Reg;
}
}
}
fn reverse_cfg_dfs(
&mut self,
from: Nid,
until: Nid,
mut each: impl FnMut(&mut Self, Nid, Block),
) {
debug_assert!(self.res.dfs_buf.is_empty());
self.res.dfs_buf.push(from);
debug_assert!(self.nodes.dominates(until, from, None));
while let Some(nid) = self.res.dfs_buf.pop() {
debug_assert!(
self.nodes.dominates(until, nid, None),
"{until} {:?}",
self.nodes[until]
);
each(self, nid, self.res.blocks[self.block_of(nid) as usize]);
if nid == until {
continue;
}
match self.nodes[nid].kind {
Kind::Then | Kind::Else | Kind::Region | Kind::Loop => {
for &n in self.nodes[nid].inputs.iter() {
if self.nodes[n].kind == Kind::Loops {
continue;
}
let d = self.nodes.idom_of(n);
if self.res.dfs_seem.set(d) {
self.res.dfs_buf.push(d);
}
}
}
Kind::Start => {}
_ => unreachable!(),
}
}
}
}
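// Scratch buffers reused across functions so scheduling and register allocation avoid
// repeated allocations.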
#[derive(Default)]
pub(super) struct Res {
blocks: Vec<Block>,
instrs: Vec<Nid>,
backrefs: Vec<u16>,
bundles: Vec<Bundle>,
node_to_reg: Vec<Reg>,
visited: BitSet,
dfs_buf: Vec<Nid>,
dfs_seem: BitSet,
#[cfg(debug_assertions)]
marked: hashbrown::HashSet<(Nid, Nid), crate::FnvBuildHasher>,
}
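// Occupancy mask over instruction slots; two values may share a register only if their
// bundles do not overlap.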
struct Bundle {
taken: Vec<bool>,
}
impl Bundle {
fn new(size: usize) -> Self {
Self { taken: vec![false; size] }
}
fn add(&mut self, range: Range<usize>) {
self.taken[range].fill(true);
}
fn overlaps(&self, other: &Self) -> bool {
self.taken.iter().zip(other.taken.iter()).any(|(a, b)| a & b)
}
fn merge(&mut self, other: &Self) {
debug_assert!(!self.overlaps(other));
self.taken.iter_mut().zip(other.taken.iter()).for_each(|(a, b)| *a |= *b);
}
fn clear(&mut self) {
self.taken.fill(false);
}
fn is_empty(&self) -> bool {
!self.taken.contains(&true)
}
fn take(&mut self) -> Self {
mem::replace(self, Self::new(self.taken.len()))
}
}
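// Half-open range `start..end` into `Res::instrs`, plus the node that opens the block.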
#[derive(Clone, Copy)]
struct Block {
start: u16,
end: u16,
entry: Nid,
}
impl Block {
pub fn range(&self) -> Range<usize> {
self.start as usize..self.end as usize
}
}

649
lang/src/utils.rs Normal file
@@ -0,0 +1,649 @@
#![expect(dead_code)]
use {
alloc::alloc,
core::{
alloc::Layout,
fmt::Debug,
hint::unreachable_unchecked,
marker::PhantomData,
mem::MaybeUninit,
ops::{Deref, DerefMut, Not},
ptr::Unique,
},
};
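// Identifier case-style checks; on failure they return the name of the expected convention.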
fn decide(b: bool, name: &'static str) -> Result<(), &'static str> {
b.then_some(()).ok_or(name)
}
pub fn is_snake_case(str: &str) -> Result<(), &'static str> {
decide(str.bytes().all(|c| matches!(c, b'a'..=b'z' | b'0'..=b'9' | b'_')), "snake_case")
}
pub fn is_pascal_case(str: &str) -> Result<(), &'static str> {
decide(
str.as_bytes()[0].is_ascii_uppercase() && str.bytes().all(|c| c.is_ascii_alphanumeric()),
"PascalCase",
)
}
pub fn is_screaming_case(str: &str) -> Result<(), &'static str> {
decide(str.bytes().all(|c| matches!(c, b'A'..=b'Z' | b'0'..=b'9' | b'_')), "SCREAMING_CASE")
}
type Nid = u16;
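// Bit set with a small-size optimization: up to `usize::BITS - 1` bits are stored inline
// in a word tagged by its top bit; larger sets spill to a heap allocation.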
pub union BitSet {
inline: usize,
alloced: Unique<AllocedBitSet>,
}
impl Debug for BitSet {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
f.debug_list().entries(self.iter()).finish()
}
}
impl Clone for BitSet {
fn clone(&self) -> Self {
if self.is_inline() {
Self { inline: unsafe { self.inline } }
} else {
let (data, _) = self.data_and_len();
let (layout, _) = Self::layout(data.len());
unsafe {
let ptr = alloc::alloc(layout);
ptr.copy_from_nonoverlapping(self.alloced.as_ptr() as _, layout.size());
Self { alloced: Unique::new_unchecked(ptr as _) }
}
}
}
}
impl Drop for BitSet {
fn drop(&mut self) {
if !self.is_inline() {
unsafe {
let cap = self.alloced.as_ref().cap;
alloc::dealloc(self.alloced.as_ptr() as _, Self::layout(cap).0);
}
}
}
}
impl Default for BitSet {
fn default() -> Self {
Self { inline: Self::FLAG }
}
}
impl BitSet {
const FLAG: usize = 1 << (Self::UNIT - 1);
const INLINE_ELEMS: usize = Self::UNIT - 1;
const UNIT: usize = core::mem::size_of::<usize>() * 8;
pub fn with_capacity(len: usize) -> Self {
let mut s = Self::default();
s.reserve(len);
s
}
fn is_inline(&self) -> bool {
unsafe { self.inline & Self::FLAG != 0 }
}
fn data_and_len(&self) -> (&[usize], usize) {
unsafe {
if self.is_inline() {
(core::slice::from_ref(&self.inline), Self::INLINE_ELEMS)
} else {
let small_vec = self.alloced.as_ref();
(
core::slice::from_raw_parts(
&small_vec.data as *const _ as *const usize,
small_vec.cap,
),
small_vec.cap * core::mem::size_of::<usize>() * 8,
)
}
}
}
fn data_mut_and_len(&mut self) -> (&mut [usize], usize) {
unsafe {
if self.is_inline() {
(core::slice::from_mut(&mut self.inline), Self::INLINE_ELEMS)
} else {
let small_vec = self.alloced.as_mut();
(
core::slice::from_raw_parts_mut(
&mut small_vec.data as *mut _ as *mut usize,
small_vec.cap,
),
small_vec.cap * Self::UNIT,
)
}
}
}
fn indexes(index: usize) -> (usize, usize) {
(index / Self::UNIT, index % Self::UNIT)
}
pub fn get(&self, index: Nid) -> bool {
let index = index as usize;
let (data, len) = self.data_and_len();
if index >= len {
return false;
}
let (elem, bit) = Self::indexes(index);
(unsafe { *data.get_unchecked(elem) }) & (1 << bit) != 0
}
pub fn set(&mut self, index: Nid) -> bool {
let index = index as usize;
let (mut data, len) = self.data_mut_and_len();
if core::intrinsics::unlikely(index >= len) {
self.grow((index + 1).next_power_of_two().max(4 * Self::UNIT));
(data, _) = self.data_mut_and_len();
}
let (elem, bit) = Self::indexes(index);
debug_assert!(elem < data.len(), "{} < {}", elem, data.len());
let elem = unsafe { data.get_unchecked_mut(elem) };
let prev = *elem;
*elem |= 1 << bit;
*elem != prev
}
fn grow(&mut self, size: usize) {
debug_assert!(size.is_power_of_two());
let slot_count = size / Self::UNIT;
let (layout, off) = Self::layout(slot_count);
let (ptr, prev_len) = unsafe {
if self.is_inline() {
let ptr = alloc::alloc(layout);
*ptr.add(off).cast::<usize>() = self.inline & !Self::FLAG;
(ptr, 1)
} else {
let prev_len = self.alloced.as_ref().cap;
let (prev_layout, _) = Self::layout(prev_len);
(alloc::realloc(self.alloced.as_ptr() as _, prev_layout, layout.size()), prev_len)
}
};
unsafe {
MaybeUninit::fill(
core::slice::from_raw_parts_mut(
ptr.add(off).cast::<MaybeUninit<usize>>().add(prev_len),
slot_count - prev_len,
),
0,
);
*ptr.cast::<usize>() = slot_count;
core::ptr::write(self, Self { alloced: Unique::new_unchecked(ptr as _) });
}
}
fn layout(slot_count: usize) -> (core::alloc::Layout, usize) {
unsafe {
core::alloc::Layout::new::<AllocedBitSet>()
.extend(Layout::array::<usize>(slot_count).unwrap_unchecked())
.unwrap_unchecked()
}
}
pub fn iter(&self) -> BitSetIter {
if self.is_inline() {
BitSetIter { index: 0, current: unsafe { self.inline & !Self::FLAG }, remining: &[] }
} else {
let &[current, ref remining @ ..] = self.data_and_len().0 else {
unsafe { unreachable_unchecked() }
};
BitSetIter { index: 0, current, remining }
}
}
pub fn clear(&mut self, len: usize) {
self.reserve(len);
if self.is_inline() {
unsafe { self.inline &= Self::FLAG };
} else {
self.data_mut_and_len().0.fill(0);
}
}
pub fn units<'a>(&'a self, slot: &'a mut usize) -> &'a [usize] {
if self.is_inline() {
*slot = unsafe { self.inline } & !Self::FLAG;
core::slice::from_ref(slot)
} else {
self.data_and_len().0
}
}
pub fn reserve(&mut self, len: usize) {
if len > self.data_and_len().1 {
self.grow(len.next_power_of_two().max(4 * Self::UNIT));
}
}
pub fn units_mut(&mut self) -> Result<&mut [usize], &mut InlineBitSetView> {
if self.is_inline() {
Err(unsafe {
core::mem::transmute::<&mut usize, &mut InlineBitSetView>(&mut self.inline)
})
} else {
Ok(self.data_mut_and_len().0)
}
}
}
pub struct InlineBitSetView(usize);
impl InlineBitSetView {
pub(crate) fn add_mask(&mut self, tmp: usize) {
debug_assert!(tmp & BitSet::FLAG == 0);
self.0 |= tmp;
}
}
pub struct BitSetIter<'a> {
index: usize,
current: usize,
remining: &'a [usize],
}
impl Iterator for BitSetIter<'_> {
type Item = usize;
fn next(&mut self) -> Option<Self::Item> {
while self.current == 0 {
self.current = *self.remining.take_first()?;
self.index += 1;
}
let sub_idx = self.current.trailing_zeros() as usize;
self.current &= self.current - 1;
Some(self.index * BitSet::UNIT + sub_idx)
}
}
struct AllocedBitSet {
cap: usize,
data: [usize; 0],
}
#[cfg(test)]
#[test]
fn test_small_bit_set() {
use std::vec::Vec;
let mut sv = BitSet::default();
sv.set(10);
debug_assert!(sv.get(10));
sv.set(100);
debug_assert!(sv.get(100));
sv.set(10000);
debug_assert!(sv.get(10000));
debug_assert_eq!(sv.iter().collect::<Vec<_>>(), &[10, 100, 10000]);
sv.clear(10000);
debug_assert_eq!(sv.iter().collect::<Vec<_>>(), &[]);
}
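// Small vector of node ids: short lists live inline, longer ones on the heap; the two
// layouts are distinguished by the `cap` field.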
pub union Vc {
inline: InlineVc,
alloced: AllocedVc,
}
impl Default for Vc {
fn default() -> Self {
Vc { inline: InlineVc { elems: MaybeUninit::uninit(), cap: Default::default() } }
}
}
impl Debug for Vc {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
self.as_slice().fmt(f)
}
}
impl FromIterator<Nid> for Vc {
fn from_iter<T: IntoIterator<Item = Nid>>(iter: T) -> Self {
let mut slf = Self::default();
for i in iter {
slf.push(i);
}
slf
}
}
const INLINE_ELEMS: usize = VC_SIZE / 2 - 1;
const VC_SIZE: usize = 16;
impl Vc {
fn is_inline(&self) -> bool {
unsafe { self.inline.cap <= INLINE_ELEMS as Nid }
}
fn layout(&self) -> Option<core::alloc::Layout> {
unsafe {
self.is_inline().not().then(|| {
core::alloc::Layout::array::<Nid>(self.alloced.cap as _).unwrap_unchecked()
})
}
}
pub fn len(&self) -> usize {
unsafe {
if self.is_inline() {
self.inline.cap as _
} else {
self.alloced.len as _
}
}
}
fn len_mut(&mut self) -> &mut Nid {
unsafe {
if self.is_inline() {
&mut self.inline.cap
} else {
&mut self.alloced.len
}
}
}
fn as_ptr(&self) -> *const Nid {
unsafe {
match self.is_inline() {
true => self.inline.elems.as_ptr().cast(),
false => self.alloced.base.as_ptr(),
}
}
}
fn as_mut_ptr(&mut self) -> *mut Nid {
unsafe {
match self.is_inline() {
true => self.inline.elems.as_mut_ptr().cast(),
false => self.alloced.base.as_ptr(),
}
}
}
pub fn as_slice(&self) -> &[Nid] {
unsafe { core::slice::from_raw_parts(self.as_ptr(), self.len()) }
}
fn as_slice_mut(&mut self) -> &mut [Nid] {
unsafe { core::slice::from_raw_parts_mut(self.as_mut_ptr(), self.len()) }
}
pub fn push(&mut self, value: Nid) {
if let Some(layout) = self.layout()
&& unsafe { self.alloced.len == self.alloced.cap }
{
unsafe {
self.alloced.cap *= 2;
self.alloced.base = Unique::new_unchecked(
alloc::realloc(
self.alloced.base.as_ptr().cast(),
layout,
self.alloced.cap as usize * core::mem::size_of::<Nid>(),
)
.cast(),
);
}
} else if self.len() == INLINE_ELEMS {
unsafe {
let mut allcd =
Self::alloc((self.inline.cap + 1).next_power_of_two() as _, self.len());
core::ptr::copy_nonoverlapping(self.as_ptr(), allcd.as_mut_ptr(), self.len());
*self = allcd;
}
}
unsafe {
*self.len_mut() += 1;
self.as_mut_ptr().add(self.len() - 1).write(value);
}
}
unsafe fn alloc(cap: usize, len: usize) -> Self {
debug_assert!(cap > INLINE_ELEMS);
let layout = unsafe { core::alloc::Layout::array::<Nid>(cap).unwrap_unchecked() };
let alloc = unsafe { alloc::alloc(layout) };
unsafe {
Vc {
alloced: AllocedVc {
base: Unique::new_unchecked(alloc.cast()),
len: len as _,
cap: cap as _,
},
}
}
}
pub fn swap_remove(&mut self, index: usize) {
let len = self.len() - 1;
self.as_slice_mut().swap(index, len);
*self.len_mut() -= 1;
}
pub fn remove(&mut self, index: usize) {
self.as_slice_mut().copy_within(index + 1.., index);
*self.len_mut() -= 1;
}
}
impl Drop for Vc {
fn drop(&mut self) {
if let Some(layout) = self.layout() {
unsafe {
alloc::dealloc(self.alloced.base.as_ptr().cast(), layout);
}
}
}
}
impl Clone for Vc {
fn clone(&self) -> Self {
self.as_slice().into()
}
}
impl IntoIterator for Vc {
type IntoIter = VcIntoIter;
type Item = Nid;
fn into_iter(self) -> Self::IntoIter {
VcIntoIter { start: 0, end: self.len(), vc: self }
}
}
pub struct VcIntoIter {
start: usize,
end: usize,
vc: Vc,
}
impl Iterator for VcIntoIter {
type Item = Nid;
fn next(&mut self) -> Option<Self::Item> {
if self.start == self.end {
return None;
}
let ret = unsafe { core::ptr::read(self.vc.as_slice().get_unchecked(self.start)) };
self.start += 1;
Some(ret)
}
fn size_hint(&self) -> (usize, Option<usize>) {
let len = self.end - self.start;
(len, Some(len))
}
}
impl DoubleEndedIterator for VcIntoIter {
fn next_back(&mut self) -> Option<Self::Item> {
if self.start == self.end {
return None;
}
self.end -= 1;
Some(unsafe { core::ptr::read(self.vc.as_slice().get_unchecked(self.end)) })
}
}
impl ExactSizeIterator for VcIntoIter {}
impl<const SIZE: usize> From<[Nid; SIZE]> for Vc {
fn from(value: [Nid; SIZE]) -> Self {
value.as_slice().into()
}
}
impl<'a> From<&'a [Nid]> for Vc {
fn from(value: &'a [Nid]) -> Self {
if value.len() <= INLINE_ELEMS {
let mut dflt = Self::default();
unsafe {
core::ptr::copy_nonoverlapping(value.as_ptr(), dflt.as_mut_ptr(), value.len())
};
dflt.inline.cap = value.len() as _;
dflt
} else {
let mut allcd = unsafe { Self::alloc(value.len(), value.len()) };
unsafe {
core::ptr::copy_nonoverlapping(value.as_ptr(), allcd.as_mut_ptr(), value.len())
};
allcd
}
}
}
impl Deref for Vc {
type Target = [Nid];
fn deref(&self) -> &Self::Target {
self.as_slice()
}
}
impl DerefMut for Vc {
fn deref_mut(&mut self) -> &mut Self::Target {
self.as_slice_mut()
}
}
#[derive(Clone, Copy)]
#[repr(C)]
struct InlineVc {
cap: Nid,
elems: MaybeUninit<[Nid; INLINE_ELEMS]>,
}
#[derive(Clone, Copy)]
#[repr(C)]
struct AllocedVc {
cap: Nid,
len: Nid,
base: Unique<Nid>,
}
pub trait Ent: Copy {
fn new(index: usize) -> Self;
fn index(self) -> usize;
}
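// Vector indexed by a typed entity handle `K` rather than a raw `usize`.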
pub struct EntVec<K: Ent, T> {
data: ::alloc::vec::Vec<T>,
k: PhantomData<fn(K)>,
}
impl<K: Ent, T> Default for EntVec<K, T> {
fn default() -> Self {
Self { data: Default::default(), k: PhantomData }
}
}
impl<K: Ent, T> EntVec<K, T> {
pub fn clear(&mut self) {
self.data.clear();
}
pub fn is_empty(&self) -> bool {
self.data.is_empty()
}
pub fn len(&self) -> usize {
self.data.len()
}
pub fn push(&mut self, value: T) -> K {
let k = K::new(self.data.len());
self.data.push(value);
k
}
pub fn next(&self, index: K) -> Option<&T> {
self.data.get(index.index() + 1)
}
pub fn shadow(&mut self, len: usize)
where
T: Default,
{
if self.data.len() < len {
self.data.resize_with(len, Default::default);
}
}
pub fn iter(&self) -> core::slice::Iter<T> {
self.data.iter()
}
}
impl<K: Ent, T> core::ops::Index<K> for EntVec<K, T> {
type Output = T;
fn index(&self, index: K) -> &Self::Output {
&self.data[index.index()]
}
}
impl<K: Ent, T> core::ops::IndexMut<K> for EntVec<K, T> {
fn index_mut(&mut self, index: K) -> &mut Self::Output {
&mut self.data[index.index()]
}
}
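// Declares newtype index handles that implement `Ent` and `Display`.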
macro_rules! decl_ent {
($(
$vis:vis struct $name:ident($index:ty);
)*) => {$(
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Debug)]
$vis struct $name($index);
impl crate::utils::Ent for $name {
fn new(index: usize) -> Self {
Self(index as $index)
}
fn index(self) -> usize {
self.0 as _
}
}
impl core::fmt::Display for $name {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
write!(f, concat!(stringify!($name), "{}"), self.0)
}
}
)*};
}
pub(crate) use decl_ent;

@@ -0,0 +1,48 @@
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
LI32 r32, 1148846080w
CP r2, r32
JAL r31, r0, :sin
CP r33, r1
FMUL32 r32, r33, r32
FTI32 r32, r32, 1b
CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
sin:
CP r13, r2
LI32 r14, 1124073472w
LI32 r15, 1078530011w
FMUL32 r14, r13, r14
FDIV32 r14, r14, r15
FTI32 r14, r14, 1b
ANDI r15, r14, 255d
ITF64 r16, r14
MULI64 r15, r15, 4d
LRA r17, r0, :sin_table
LI32 r18, 1086918619w
FC64T32 r16, r16, 1b
ADDI64 r14, r14, 64d
ADD64 r15, r17, r15
LI32 r19, 1132462080w
FMUL32 r16, r16, r18
ANDI r14, r14, 255d
LI32 r18, 1056964608w
LD r15, r15, 0a, 4h
FDIV32 r16, r16, r19
MULI64 r14, r14, 4d
FMUL32 r18, r15, r18
FSUB32 r13, r13, r16
ADD64 r14, r17, r14
FMUL32 r16, r13, r18
LD r14, r14, 0a, 4h
FSUB32 r14, r14, r16
FMUL32 r13, r14, r13
FADD32 r13, r15, r13
CP r1, r13
JALA r0, r31, 0a
code size: 1315
ret: 826
status: Ok(())

@@ -0,0 +1,6 @@
main:
CP r1, r0
JALA r0, r31, 0a
code size: 22
ret: 0
status: Ok(())

@@ -0,0 +1,6 @@
main:
CP r1, r0
JALA r0, r31, 0a
code size: 22
ret: 0
status: Ok(())

@@ -0,0 +1,32 @@
main:
ADDI64 r254, r254, -56d
ST r31, r254, 24a, 32h
LI64 r32, 1d
ADDI64 r33, r254, 0d
ST r32, r254, 0a, 8h
LI64 r34, 2d
ST r34, r254, 8a, 8h
LI64 r34, 4d
ST r34, r254, 16a, 8h
CP r2, r33
JAL r31, r0, :pass
CP r33, r1
ADD64 r32, r33, r32
CP r1, r32
LD r31, r254, 24a, 32h
ADDI64 r254, r254, 56d
JALA r0, r31, 0a
pass:
CP r13, r2
LD r14, r13, 8a, 8h
MULI64 r15, r14, 8d
LD r16, r13, 0a, 8h
ADD64 r13, r15, r13
ADD64 r14, r14, r16
LD r13, r13, 0a, 8h
ADD64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a
code size: 246
ret: 8
status: Ok(())

@@ -0,0 +1,8 @@
main:
LRA r13, r0, :sin_table
LD r13, r13, 80a, 8h
CP r1, r13
JALA r0, r31, 0a
code size: 770
ret: 1736
status: Ok(())

@@ -0,0 +1,14 @@
main:
CP r14, r2
LI64 r13, 1d
JNE r14, r13, :0
JMP :1
0: JNE r14, r0, :2
LI64 r13, 2d
JMP :1
2: LI64 r13, 3d
1: CP r1, r13
JALA r0, r31, 0a
code size: 75
ret: 2
status: Ok(())

@@ -0,0 +1,32 @@
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
LRA r32, r0, :"abඞ\n\r\t56789\0"
CP r2, r32
JAL r31, r0, :str_len
CP r32, r1
LRA r33, r0, :"fff\0"
CP r2, r33
JAL r31, r0, :str_len
CP r33, r1
ADD64 r32, r33, r32
CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
str_len:
CP r15, r2
CP r14, r0
CP r13, r14
2: LD r16, r15, 0a, 1h
ANDI r16, r16, 255d
JNE r16, r14, :0
CP r1, r13
JMP :1
0: ADDI64 r15, r15, 1d
ADDI64 r13, r13, 1d
JMP :2
1: JALA r0, r31, 0a
code size: 216
ret: 16
status: Ok(())

@@ -0,0 +1,13 @@
foo:
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -8d
ST r31, r254, 0a, 8h
JAL r31, r0, :foo
CP r1, r0
LD r31, r254, 0a, 8h
ADDI64 r254, r254, 8d
JALA r0, r31, 0a
code size: 88
ret: 0
status: Ok(())

@@ -0,0 +1,8 @@
main:
LRA r13, r0, :a
LD r13, r13, 0a, 8h
CP r1, r13
JALA r0, r31, 0a
code size: 50
ret: 50
status: Ok(())

@@ -0,0 +1,8 @@
main:
LRA r13, r0, :a
LD r13, r13, 0a, 8h
CP r1, r13
JALA r0, r31, 0a
code size: 50
ret: 50
status: Ok(())

@@ -0,0 +1,19 @@
cond:
CP r1, r0
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
JAL r31, r0, :cond
CP r33, r1
CP r32, r0
JNE r33, r32, :0
JMP :1
0: LI64 r32, 2d
1: CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
code size: 117
ret: 0
status: Ok(())

@@ -0,0 +1,6 @@
main:
CP r1, r0
JALA r0, r31, 0a
code size: 22
ret: 0
status: Ok(())

@@ -0,0 +1,7 @@
main:
LI32 r13, 69w
CP r1, r13
JALA r0, r31, 0a
code size: 28
ret: 69
status: Ok(())

@@ -0,0 +1,6 @@
main:
CP r1, r0
JALA r0, r31, 0a
code size: 22
ret: 0
status: Ok(())

@@ -0,0 +1,19 @@
main:
LI64 r15, 3d
LI64 r16, 10d
CP r14, r0
CP r13, r14
3: JNE r13, r16, :0
LI64 r14, -10d
ADD64 r14, r13, r14
CP r1, r14
JMP :1
0: DIRU64 r0, r17, r13, r15
JNE r17, r14, :2
JMP :2
2: ADDI64 r13, r13, 1d
JMP :3
1: JALA r0, r31, 0a
code size: 103
ret: 0
status: Ok(())

@@ -0,0 +1,5 @@
main:
UN
code size: 9
ret: 0
status: Err(Unreachable)

@@ -0,0 +1,77 @@
main:
ADDI64 r254, r254, -104d
ST r31, r254, 48a, 56h
LRA r32, r0, :glob_stru
JAL r31, r0, :new_stru
ST r1, r32, 0a, 16h
CP r33, r0
LD r34, r32, 0a, 8h
JEQ r34, r33, :0
LI64 r32, 300d
CP r1, r32
JMP :1
0: ST r33, r32, 0a, 8h
LD r34, r32, 0a, 8h
JEQ r34, r33, :2
LI64 r32, 200d
CP r1, r32
JMP :1
2: LI64 r34, 1d
ST r34, r32, 0a, 8h
ST r34, r32, 8a, 8h
ADDI64 r35, r254, 0d
ST r34, r254, 0a, 8h
ST r34, r254, 8a, 8h
ST r34, r254, 16a, 8h
ST r34, r254, 24a, 8h
ST r34, r254, 32a, 8h
ST r34, r254, 40a, 8h
LI64 r36, 3d
CP r32, r33
8: JNE r32, r36, :3
LD r32, r254, 32a, 8h
JEQ r32, r33, :4
LI64 r32, 100d
CP r1, r32
JMP :1
4: ST r34, r254, 0a, 8h
ST r34, r254, 8a, 8h
ST r34, r254, 16a, 8h
ST r34, r254, 24a, 8h
ST r34, r254, 32a, 8h
ST r34, r254, 40a, 8h
CP r32, r33
7: LD r37, r254, 32a, 8h
JNE r32, r36, :5
JEQ r37, r33, :6
LI64 r32, 10d
CP r1, r32
JMP :1
6: CP r1, r33
JMP :1
5: MULI64 r37, r32, 16d
ADD64 r37, r35, r37
ST r33, r37, 0a, 8h
ST r33, r37, 8a, 8h
ADD64 r32, r32, r34
JMP :7
3: MULI64 r37, r32, 16d
ADD64 r37, r35, r37
JAL r31, r0, :new_stru
ST r1, r37, 0a, 16h
ADD64 r32, r32, r34
JMP :8
1: LD r31, r254, 48a, 56h
ADDI64 r254, r254, 104d
JALA r0, r31, 0a
new_stru:
ADDI64 r254, r254, -16d
ADDI64 r13, r254, 0d
ST r0, r254, 0a, 8h
ST r0, r254, 8a, 8h
LD r1, r13, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 684
ret: 0
status: Ok(())

@@ -0,0 +1,29 @@
main:
ADDI64 r254, r254, -12d
LI8 r13, 255b
ST r13, r254, 0a, 1h
ST r0, r254, 1a, 1h
ST r0, r254, 2a, 1h
ST r13, r254, 3a, 1h
ST r0, r254, 4a, 4h
LD r13, r254, 4a, 4h
LI32 r14, 2w
ST r14, r254, 8a, 4h
LD r14, r254, 8a, 4h
LI64 r15, 2d
ANDI r14, r14, 4294967295d
JEQ r14, r15, :0
CP r1, r0
JMP :1
0: ANDI r13, r13, 4294967295d
JEQ r13, r0, :2
LI64 r13, 64d
CP r1, r13
JMP :1
2: LI64 r13, 512d
CP r1, r13
1: ADDI64 r254, r254, 12d
JALA r0, r31, 0a
code size: 235
ret: 512
status: Ok(())

@@ -0,0 +1,22 @@
main:
ADDI64 r254, r254, -16d
LI64 r13, 10d
ADDI64 r14, r254, 0d
ST r13, r254, 0a, 8h
LI64 r13, 20d
ST r13, r254, 8a, 8h
LI64 r13, 6d
LI64 r15, 5d
LI64 r16, 1d
CP r2, r16
CP r5, r15
CP r6, r13
LD r3, r14, 0a, 16h
ECA
CP r1, r0
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
ev: Ecall
code size: 154
ret: 0
status: Ok(())

@@ -0,0 +1,20 @@
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
JAL r31, r0, :some_enum
CP r32, r1
ANDI r32, r32, 255d
JNE r32, r0, :0
CP r1, r0
JMP :1
0: LI64 r32, 100d
CP r1, r32
1: LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
some_enum:
CP r1, r0
JALA r0, r31, 0a
code size: 128
ret: 0
status: Ok(())

@@ -0,0 +1,114 @@
continue_and_state_change:
CP r13, r2
CP r15, r0
LI64 r16, 3d
LI64 r14, 4d
LI64 r17, 2d
LI64 r18, 10d
6: JLTU r13, r18, :0
JMP :1
0: JNE r13, r17, :2
CP r13, r14
JMP :3
2: JNE r13, r16, :4
CP r13, r15
1: CP r1, r13
JMP :5
4: ADDI64 r13, r13, 1d
3: JMP :6
5: JALA r0, r31, 0a
infinite_loop:
ADDI64 r254, r254, -32d
ST r31, r254, 0a, 32h
LI64 r34, 1d
CP r33, r0
CP r32, r33
1: JNE r32, r34, :0
JMP :0
0: CP r2, r33
JAL r31, r0, :continue_and_state_change
CP r32, r1
JMP :1
LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -40d
ST r31, r254, 0a, 40h
CP r2, r0
JAL r31, r0, :multiple_breaks
CP r32, r1
LI64 r33, 3d
JEQ r32, r33, :0
LI64 r32, 1d
CP r1, r32
JMP :1
0: LI64 r32, 4d
CP r2, r32
JAL r31, r0, :multiple_breaks
CP r34, r1
LI64 r35, 10d
JEQ r34, r35, :2
LI64 r32, 2d
CP r1, r32
JMP :1
2: CP r2, r0
JAL r31, r0, :state_change_in_break
CP r34, r1
JEQ r34, r0, :3
CP r1, r33
JMP :1
3: CP r2, r32
JAL r31, r0, :state_change_in_break
CP r34, r1
JEQ r34, r35, :4
CP r1, r32
JMP :1
4: CP r2, r35
JAL r31, r0, :continue_and_state_change
CP r32, r1
JEQ r32, r35, :5
LI64 r32, 5d
CP r1, r32
JMP :1
5: CP r2, r33
JAL r31, r0, :continue_and_state_change
CP r32, r1
JEQ r32, r0, :6
LI64 r32, 6d
CP r1, r32
JMP :1
6: JAL r31, r0, :infinite_loop
CP r1, r0
1: LD r31, r254, 0a, 40h
ADDI64 r254, r254, 40d
JALA r0, r31, 0a
multiple_breaks:
CP r13, r2
LI64 r14, 3d
LI64 r15, 10d
4: JLTU r13, r15, :0
JMP :1
0: ADDI64 r13, r13, 1d
JNE r13, r14, :2
1: CP r1, r13
JMP :3
2: JMP :4
3: JALA r0, r31, 0a
state_change_in_break:
CP r13, r2
LI64 r14, 3d
LI64 r15, 10d
4: JLTU r13, r15, :0
JMP :1
0: JNE r13, r14, :2
CP r13, r0
1: CP r1, r13
JMP :3
2: ADDI64 r13, r13, 1d
JMP :4
3: JALA r0, r31, 0a
timed out
code size: 667
ret: 10
status: Ok(())

@@ -0,0 +1,55 @@
check_platform:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
JAL r31, r0, :x86_fb_ptr
CP r32, r1
CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -56d
ST r31, r254, 0a, 56h
JAL r31, r0, :check_platform
CP r33, r0
LI64 r36, 30d
LI64 r37, 100d
CP r35, r33
CP r34, r33
CP r32, r33
5: JLTU r32, r36, :0
ADDI64 r34, r34, 1d
CP r2, r33
CP r3, r34
CP r4, r36
JAL r31, r0, :set_pixel
CP r32, r1
JEQ r32, r35, :1
CP r1, r33
JMP :2
1: JNE r34, r37, :3
CP r1, r35
JMP :2
3: CP r32, r33
JMP :4
0: ADDI64 r35, r35, 1d
ADDI64 r32, r32, 1d
4: JMP :5
2: LD r31, r254, 0a, 56h
ADDI64 r254, r254, 56d
JALA r0, r31, 0a
set_pixel:
CP r13, r2
CP r14, r3
CP r15, r4
MUL64 r14, r14, r15
ADD64 r13, r14, r13
CP r1, r13
JALA r0, r31, 0a
x86_fb_ptr:
LI64 r13, 100d
CP r1, r13
JALA r0, r31, 0a
code size: 329
ret: 3000
status: Ok(())

@@ -0,0 +1,7 @@
main:
LI32 r13, 3212836864w
CP r1, r13
JALA r0, r31, 0a
code size: 28
ret: 3212836864
status: Ok(())

@@ -0,0 +1,29 @@
add_one:
CP r13, r2
ADDI64 r13, r13, 1d
CP r1, r13
JALA r0, r31, 0a
add_two:
CP r13, r2
ADDI64 r13, r13, 2d
CP r1, r13
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
LI64 r32, 10d
CP r2, r32
JAL r31, r0, :add_one
CP r32, r1
LI64 r33, 20d
CP r2, r33
JAL r31, r0, :add_two
CP r33, r1
ADD64 r32, r33, r32
CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
code size: 176
ret: 33
status: Ok(())

@@ -0,0 +1,38 @@
add:
CP r13, r2
CP r14, r3
ADD64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a
add:
CP r13, r2
CP r14, r3
ADD32 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a
add:
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -32d
ST r31, r254, 0a, 32h
JAL r31, r0, :add
LI32 r32, 2w
CP r2, r32
CP r3, r32
JAL r31, r0, :add
CP r32, r1
LI64 r33, 3d
LI64 r34, 1d
CP r2, r34
CP r3, r33
JAL r31, r0, :add
CP r33, r1
ANDI r32, r32, 4294967295d
SUB64 r32, r32, r33
CP r1, r32
LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
code size: 209
ret: 0
status: Ok(())

@@ -0,0 +1,32 @@
main:
ADDI64 r254, r254, -8d
ST r31, r254, 0a, 8h
JAL r31, r0, :process
LD r31, r254, 0a, 8h
ADDI64 r254, r254, 8d
JALA r0, r31, 0a
opaque:
JALA r0, r31, 0a
process:
ADDI64 r254, r254, -48d
ST r31, r254, 16a, 32h
ADDI64 r33, r254, 0d
ST r0, r254, 0a, 1h
LI64 r32, 1000d
4: JGTU r32, r0, :0
JMP :1
0: CP r2, r33
JAL r31, r0, :opaque
LD r34, r254, 0a, 1h
ANDI r34, r34, 255d
JEQ r34, r0, :2
JMP :3
2: ADDI64 r32, r32, -1d
1: JMP :4
3: LD r31, r254, 16a, 32h
ADDI64 r254, r254, 48d
JALA r0, r31, 0a
timed out
code size: 248
ret: 0
status: Ok(())

@@ -0,0 +1,6 @@
test.hb:60:22: somehow this was not found
defer deinit(uint, &vec)
^
test.hb:60:21: expected argument vec to be of type ^[Struct0]{data: ^uint, len: uint, cap: uint}, got ^never
defer deinit(uint, &vec)
^

@@ -0,0 +1,19 @@
clobber:
LRA r13, r0, :var
ST r0, r13, 0a, 8h
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
LRA r32, r0, :var
LI64 r33, 2d
ST r33, r32, 0a, 8h
JAL r31, r0, :clobber
LD r32, r32, 0a, 8h
CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
code size: 159
ret: 0
status: Ok(())

@@ -0,0 +1,23 @@
inb:
CP r1, r0
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -32d
ST r31, r254, 0a, 32h
LRA r32, r0, :ports
LD r33, r32, 0a, 1h
ANDI r33, r33, 255d
JNE r33, r0, :0
JMP :1
0: JAL r31, r0, :inb
CP r33, r1
CMPU r34, r33, r0
CMPUI r34, r34, 0d
NOT r34, r34
ST r34, r32, 0a, 1h
1: LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
code size: 164
ret: 0
status: Ok(())

@@ -0,0 +1,10 @@
main:
LRA r13, r0, :complex_global_var
LD r14, r13, 0a, 8h
ADDI64 r14, r14, 5d
ST r14, r13, 0a, 8h
CP r1, r14
JALA r0, r31, 0a
code size: 74
ret: 55
status: Ok(())

@@ -0,0 +1,6 @@
main:
CP r1, r0
JALA r0, r31, 0a
code size: 22
ret: 0
status: Ok(())

@@ -0,0 +1,21 @@
main:
ADDI64 r254, r254, -128d
ADDI64 r15, r254, 0d
LI8 r16, 69b
LI64 r17, 128d
CP r13, r0
2: LD r14, r254, 42a, 1h
JLTU r13, r17, :0
ANDI r13, r14, 255d
CP r1, r13
JMP :1
0: ADDI64 r14, r13, 1d
ADD64 r13, r15, r13
ST r16, r13, 0a, 1h
CP r13, r14
JMP :2
1: ADDI64 r254, r254, 128d
JALA r0, r31, 0a
code size: 141
ret: 69
status: Ok(())

@@ -0,0 +1,36 @@
fib:
ADDI64 r254, r254, -32d
ST r31, r254, 0a, 32h
CP r32, r2
LI64 r33, 1d
LI64 r34, 2d
JGTU r32, r34, :0
CP r1, r33
JMP :1
0: SUB64 r33, r32, r33
CP r2, r33
JAL r31, r0, :fib
CP r33, r1
SUB64 r32, r32, r34
CP r2, r32
JAL r31, r0, :fib
CP r32, r1
ADD64 r32, r32, r33
CP r1, r32
1: LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
LI64 r32, 10d
CP r2, r32
JAL r31, r0, :fib
CP r32, r1
CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 229
ret: 55
status: Ok(())

@@ -0,0 +1,9 @@
main:
CP r13, r0
0: ADDI64 r13, r13, 1d
JMP :0
JALA r0, r31, 0a
timed out
code size: 38
ret: 0
status: Ok(())

@@ -0,0 +1,21 @@
main:
LI64 r13, 8d
CP r2, r13
ECA
LI64 r14, 6d
LRA r13, r0, :gb
LD r13, r13, 0a, 8h
CMPU r13, r13, r0
CMPUI r13, r13, 0d
OR r13, r13, r0
ANDI r13, r13, 255d
JNE r13, r0, :0
CP r13, r14
JMP :1
0: LI64 r13, 1d
1: SUB64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a
code size: 131
ret: 0
status: Ok(())

@@ -0,0 +1,19 @@
main:
ADDI64 r254, r254, -72d
ADDI64 r13, r254, 24d
ST r0, r254, 24a, 8h
LI64 r14, 1d
ST r14, r254, 32a, 8h
LI64 r14, 2d
ST r14, r254, 40a, 8h
ADDI64 r14, r254, 0d
BMC r13, r14, 24h
ADDI64 r13, r254, 48d
BMC r14, r13, 24h
LD r13, r254, 48a, 8h
CP r1, r13
ADDI64 r254, r254, 72d
JALA r0, r31, 0a
code size: 159
ret: 0
status: Ok(())

@@ -0,0 +1,29 @@
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
JAL r31, r0, :scalar_values
CP r32, r1
JEQ r32, r0, :0
LI64 r32, 1d
CP r1, r32
JMP :1
0: JAL r31, r0, :structs
CP r32, r1
JEQ r32, r0, :2
JAL r31, r0, :structs
CP r32, r1
CP r1, r32
JMP :1
2: CP r1, r0
1: LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
scalar_values:
CP r1, r0
JALA r0, r31, 0a
structs:
CP r1, r0
JALA r0, r31, 0a
code size: 164
ret: 0
status: Ok(())

@@ -0,0 +1,7 @@
main:
LI64 r13, 10d
CP r1, r13
JALA r0, r31, 0a
code size: 32
ret: 10
status: Ok(())

@@ -0,0 +1,104 @@
main:
ADDI64 r254, r254, -122d
ST r31, r254, 58a, 64h
ADDI64 r32, r254, 33d
ADDI64 r33, r254, 34d
ADDI64 r34, r254, 1d
ADDI64 r35, r254, 17d
ST r32, r254, 34a, 8h
LI64 r36, 100d
ADDI64 r37, r254, 0d
LI8 r38, 1b
ST r0, r254, 1a, 8h
ST r0, r254, 17a, 8h
ST r36, r254, 42a, 8h
ST r38, r254, 0a, 1h
ST r0, r254, 9a, 8h
ST r0, r254, 25a, 8h
ST r36, r254, 50a, 8h
ST r0, r254, 33a, 1h
CP r2, r33
LD r3, r35, 0a, 16h
LD r5, r34, 0a, 16h
LD r7, r37, 0a, 1h
JAL r31, r0, :put_filled_rect
LD r31, r254, 58a, 64h
ADDI64 r254, r254, 122d
JALA r0, r31, 0a
put_filled_rect:
ADDI64 r254, r254, -108d
CP r14, r2
ST r3, r254, 92a, 16h
ADDI64 r3, r254, 92d
CP r15, r3
ST r5, r254, 76a, 16h
ADDI64 r5, r254, 76d
CP r13, r5
ST r7, r254, 75a, 1h
ADDI64 r7, r254, 75d
CP r16, r7
ADDI64 r17, r254, 25d
LI8 r18, 5b
ST r18, r254, 25a, 1h
LD r19, r13, 0a, 8h
ST r19, r254, 26a, 4h
LI64 r20, 1d
ST r20, r254, 30a, 4h
ST r16, r254, 34a, 8h
LI64 r21, 25d
ADDI64 r22, r254, 50d
ST r18, r254, 50a, 1h
ST r19, r254, 51a, 4h
ST r20, r254, 55a, 4h
ST r16, r254, 59a, 8h
LI64 r23, 2d
LI64 r24, 8d
LD r25, r15, 8a, 8h
LD r13, r13, 8a, 8h
ADD64 r26, r13, r25
SUB64 r26, r26, r20
LD r27, r14, 8a, 8h
MUL64 r26, r27, r26
LD r14, r14, 0a, 8h
ADD64 r26, r14, r26
LD r28, r15, 0a, 8h
ADD64 r15, r28, r26
MUL64 r25, r27, r25
ADD64 r14, r14, r25
ADD64 r14, r28, r14
3: JGTU r13, r20, :0
JNE r13, r20, :1
ADDI64 r13, r254, 0d
ST r18, r254, 0a, 1h
ST r19, r254, 1a, 4h
ST r20, r254, 5a, 4h
ST r16, r254, 9a, 8h
ST r14, r254, 17a, 8h
CP r2, r24
CP r3, r23
CP r4, r13
CP r5, r21
ECA
JMP :1
1: JMP :2
0: ST r14, r254, 67a, 8h
CP r2, r24
CP r3, r23
CP r4, r22
CP r5, r21
ECA
ST r15, r254, 42a, 8h
CP r2, r24
CP r3, r23
CP r4, r17
CP r5, r21
ECA
SUB64 r13, r13, r23
SUB64 r15, r15, r27
ADD64 r14, r27, r14
JMP :3
2: ADDI64 r254, r254, 108d
JALA r0, r31, 0a
code size: 875
ret: 0
status: Ok(())

@@ -0,0 +1,25 @@
main:
ADDI64 r254, r254, -48d
ST r31, r254, 16a, 32h
ADDI64 r32, r254, 0d
ADDI64 r33, r254, 8d
ST r0, r254, 0a, 8h
ST r0, r254, 8a, 8h
LI64 r34, 1024d
CP r2, r33
CP r3, r32
CP r4, r34
JAL r31, r0, :set
CP r32, r1
ANDI r32, r32, 4294967295d
CP r1, r32
LD r31, r254, 16a, 32h
ADDI64 r254, r254, 48d
JALA r0, r31, 0a
set:
CP r13, r4
CP r1, r13
JALA r0, r31, 0a
code size: 175
ret: 1024
status: Ok(())

@@ -0,0 +1,28 @@
integer_range:
CP r13, r2
CP r14, r3
LI64 r15, 4d
LI64 r16, 3d
CP r2, r16
CP r3, r15
ECA
CP r15, r1
SUB64 r14, r14, r13
ADDI64 r14, r14, 1d
DIRU64 r0, r14, r15, r14
ADD64 r13, r14, r13
CP r1, r13
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
LI64 r32, 1000d
CP r2, r0
CP r3, r32
JAL r31, r0, :integer_range
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 164
ret: 42
status: Ok(())

@@ -0,0 +1,16 @@
main:
ADDI64 r254, r254, -8d
LI64 r13, 10d
ST r13, r254, 0a, 8h
2: LD r13, r254, 0a, 8h
JNE r13, r0, :0
CP r1, r13
JMP :1
0: ADDI64 r13, r13, -1d
ST r13, r254, 0a, 8h
JMP :2
1: ADDI64 r254, r254, 8d
JALA r0, r31, 0a
code size: 119
ret: 0
status: Ok(())

@@ -0,0 +1,28 @@
fib:
CP r13, r2
LI64 r17, 1d
CP r15, r0
CP r16, r17
CP r14, r15
2: JNE r13, r15, :0
CP r1, r14
JMP :1
0: SUB64 r13, r13, r17
ADD64 r14, r16, r14
SWA r14, r16
JMP :2
1: JALA r0, r31, 0a
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
LI64 r32, 10d
CP r2, r32
JAL r31, r0, :fib
CP r32, r1
CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 155
ret: 55
status: Ok(())

Some files were not shown because too many files have changed in this diff