Compare commits


200 commits
trunk ... trunk

Author SHA1 Message Date
Jakub Doka 08fc9d6ab6
gc
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 21:55:26 +01:00
Jakub Doka abcb3434f8
removing unwraps that can cause panics when typechecking
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 21:54:32 +01:00
Jakub Doka 4ebf1c7996
fixing outdated error message
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 21:34:58 +01:00
Jakub Doka 3f11e19a91
properly handling generic method calls
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 21:29:30 +01:00
Jakub Doka 51bfbdf7ae
tests are now autogenerated from the readme; this makes test names DRY
I want to reduce the friction of adding new tests as much as possible

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 21:17:03 +01:00
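The commit above describes generating the test suite from the README so examples and tests cannot drift apart. A minimal build-script sketch of that idea follows; the heading/fence format, the `run_example` helper, and the output location are assumptions for illustration (the .gitignore change later in this diff adds /lang/src/testcases.rs, suggesting the real script writes into the source tree instead).

````rust
// build.rs sketch: generate one #[test] per named code block found in the
// README, so adding an example to the README is all it takes to add a test.
// Assumed layout: a "#### <name>" heading directly followed by a ```hb fence;
// `run_example` is a hypothetical helper in the crate under test.
use std::{env, fs, path::Path};

fn main() {
    println!("cargo:rerun-if-changed=README.md");
    let readme = fs::read_to_string("README.md").unwrap();
    let mut out = String::new();
    let mut lines = readme.lines().peekable();
    while let Some(line) = lines.next() {
        let Some(name) = line.strip_prefix("#### ") else { continue };
        if lines.next_if(|l| l.starts_with("```hb")).is_none() {
            continue; // heading without an example block
        }
        let body = lines.by_ref().take_while(|l| !l.starts_with("```")).collect::<Vec<_>>();
        let name = name.trim().replace([' ', '-'], "_");
        out.push_str(&format!(
            "#[test]\nfn {name}() {{ crate::run_example({:?}); }}\n",
            body.join("\n"),
        ));
    }
    let dest = Path::new(&env::var("OUT_DIR").unwrap()).join("testcases.rs");
    fs::write(dest, out).unwrap();
}
````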
Jakub Doka 5c8f7c9c79
i am not useless after all, the invalid store elimination removed
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-25 20:27:16 +01:00
Jakub Doka 3491814b4f
i am tired
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 21:50:03 +01:00
Jakub Doka ee434e6135
more forgotten stuff
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 21:43:53 +01:00
Jakub Doka 9afe191bca
fixed a missing feature
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 21:41:28 +01:00
Jakub Doka 8ededb8612
adding standard instruction logging utility
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 21:33:15 +01:00
Jakub Doka 9c4b84ce33
fixing the precedence regarding slice ranges
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 20:43:49 +01:00
Jakub Doka 5909837015
making sure sliced pointer is loaded
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 18:53:52 +01:00
Jakub Doka 9f67b22aa2
maybe now the bug is fixed
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 17:01:36 +01:00
Jakub Doka 939d0807fb
fixing slice slicing
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-22 16:18:44 +01:00
Jakub Doka 888b38ad4c
forgot to transition @nameof to slices
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-21 23:55:02 +01:00
Jakub Doka 5275a7e0fd
adding slices
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-21 23:44:33 +01:00
Jakub Doka 418fd0039e
making the pointered arrays work properly
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-21 17:36:42 +01:00
Jakub Doka d220823d78
fixing the bug in previous commit
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-21 17:14:33 +01:00
Jakub Doka af19f4e30d
making the identifiers accessible if they are captured
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-21 14:21:58 +01:00
Jakub Doka 1621d93e86
adding more stuff to the blog
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-20 22:54:46 +01:00
Jakub Doka 4b3b6af70e
adding handler for tuples with known type
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-20 11:59:21 +01:00
Jakub Doka f59c0c1092
some syntax changes
mainly added the explicit type for declarations

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-20 11:32:18 +01:00
Jakub Doka 8ad58ee6b6
changing the array syntax to be distinct from tuple
the arrays can be declared in a more natural way
type of the element can also be inferred from the first element

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-20 10:17:34 +01:00
Jakub Doka 6e8eb059f6
adding tuples
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-19 23:08:36 +01:00
Jakub Doka 969ea57e3f
optimizing the bitset used in register allocation
also fixing an enum bug

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-19 19:43:30 +01:00
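The bitset itself is not shown in this compare, but the shape of such a structure in a register allocator is roughly the following; this is an illustrative sketch, not the repo's actual type.

```rust
// Dense bitset of the kind used for liveness sets: one u64 word per 64
// indices, sized from the universe of virtual registers.
#[derive(Clone, Default)]
struct BitSet {
    words: Vec<u64>,
}

impl BitSet {
    fn with_capacity(bits: usize) -> Self {
        Self { words: vec![0; bits.div_ceil(64)] }
    }

    fn set(&mut self, i: usize) -> bool {
        let (w, b) = (i / 64, i % 64);
        if w >= self.words.len() {
            self.words.resize(w + 1, 0);
        }
        let prev = self.words[w];
        self.words[w] = prev | (1 << b);
        prev & (1 << b) == 0 // true if the bit was newly inserted
    }

    fn get(&self, i: usize) -> bool {
        self.words.get(i / 64).is_some_and(|w| w & (1 << (i % 64)) != 0)
    }

    /// Union-in another set, reporting whether anything changed —
    /// the usual fixed-point test in liveness dataflow.
    fn union_with(&mut self, other: &Self) -> bool {
        if self.words.len() < other.words.len() {
            self.words.resize(other.words.len(), 0);
        }
        let mut changed = false;
        for (a, b) in self.words.iter_mut().zip(&other.words) {
            let new = *a | b;
            changed |= new != *a;
            *a = new;
        }
        changed
    }
}
```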
Jakub Doka cfd3eac0a8
making the instruction scheduling smarter
the instructions that are only depended on by phis are pushed to the end of
the block, which usually saves copy instructions

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-19 11:00:19 +01:00
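The change described above amounts to something like the following pass over a block's schedule; the accessors are stand-ins for the compiler's real sea-of-nodes structures, not its actual API.

```rust
// Within one basic block, values consumed only by phi nodes of successor
// blocks are sunk to the end of the block, so they are defined right where
// the end-of-block copies into phi registers happen.
fn sink_phi_only_defs(
    block: &mut Vec<usize>,             // instruction ids in schedule order
    uses: &dyn Fn(usize) -> Vec<usize>, // ids of a value's users
    is_phi: &dyn Fn(usize) -> bool,
) {
    // Stable partition: phi-only values move to the back, everything else
    // keeps its position. This is safe because a value whose only users are
    // phis cannot feed any other instruction scheduled in this block.
    let (mut front, mut back) = (Vec::new(), Vec::new());
    for &inst in block.iter() {
        let users = uses(inst);
        let phi_only = !users.is_empty() && users.iter().all(|&u| is_phi(u));
        if phi_only { back.push(inst) } else { front.push(inst) }
    }
    front.extend(back);
    *block = front;
}
```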
Jakub Doka a8aba7e7c2
making the Call node less special
return value is now a separate node pinned to the call

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-19 10:48:05 +01:00
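A rough illustration of the restructuring described above, in sea-of-nodes terms: the Call node no longer doubles as its own return value; instead a separate node is pinned to the call and carries the returned value. Names are illustrative, not the compiler's actual ones.

```rust
enum Kind {
    Call { func: u32 },
    // Pinned to its call: always scheduled right after it, never hoisted away.
    CallReturn { call: u32 },
    // ...other node kinds elided
}

struct Node {
    kind: Kind,
    inputs: Vec<u32>, // control and data dependencies
}

// Value users now point at the CallReturn node, so passes walking value edges
// no longer need a special case for "a Call used as a value".
fn return_of(nodes: &[Node], call: u32) -> Option<u32> {
    nodes
        .iter()
        .position(|n| matches!(n.kind, Kind::CallReturn { call: c } if c == call))
        .map(|i| i as u32)
}
```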
Jakub Doka f05c61a99e
adding @ChildOf directive
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-18 00:06:57 +01:00
Jakub Doka e769fa8dba
removing error for needless @as temporarily
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 22:29:16 +01:00
Jakub Doka b3f858f64b
adding @error directive
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 21:35:47 +01:00
Jakub Doka 1584ec7563
adding @Any directive
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 21:11:02 +01:00
Jakub Doka 6085177982
fixed the unreachable functions deleting branches
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 20:53:13 +01:00
Jakub Doka 47014c6164
lifting the restriction for inlining to allow normal functions as well
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 19:32:30 +01:00
Jakub Doka 3702a99d03
fixing another incorrect file reporting
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 19:30:42 +01:00
Jakub Doka 248bdf003a
making the else branch have less priority
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 19:10:56 +01:00
Jakub Doka d3f3fe98e3
propagating unreachable for functions returning never type
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 19:08:53 +01:00
Jakub Doka 14cf5efaa5
handling comptime known match
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 19:01:01 +01:00
Jakub Doka 95496116b0
making @len work on strings
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 18:30:19 +01:00
Jakub Doka 86f7d70747
adding default values to struct fields and @kindof directive
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 17:51:14 +01:00
Jakub Doka 0516ce68f4
adding @nameof
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 16:46:43 +01:00
Jakub Doka 945e5c70f6
extractng Nodes from son.rs
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 00:18:44 +01:00
Jakub Doka ec9bb886f8
removing more repetitive patterns
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-17 00:02:00 +01:00
Jakub Doka 127fdb3cc5
reducing repeating patterns
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 23:47:31 +01:00
Jakub Doka e65c72e19f
properly type checking, null checks are fixed
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 23:32:08 +01:00
Jakub Doka 1ca9529302
fixing missing antidependencies
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 17:37:14 +01:00
Jakub Doka c0d957e70c
removing needless errors
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 15:05:19 +01:00
Jakub Doka b9b8233a53
typo
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 15:03:07 +01:00
Jakub Doka d2fa41039b
strengthening the error recovery
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 15:00:48 +01:00
Jakub Doka 9fe8d6bbff
support integer to float coercion in more places
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 14:38:41 +01:00
Jakub Doka 04680c8b7c
fixing an incredible edge case
this basically only happens if the Vc oscillates between 7 and 8 elements

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 14:27:23 +01:00
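The commit message suggests `Vc` is a small-vector-style container with an inline capacity around 8, and that the bug only appeared when the length kept crossing the inline/heap boundary. The sketch below shows that boundary under those assumptions; it is not the actual `Vc` from the compiler.

```rust
enum Vc<T> {
    Inline { len: u8, buf: [Option<T>; 8] }, // up to 8 elements stored in place
    Heap(Vec<T>),
}

impl<T> Vc<T> {
    fn push(&mut self, value: T) {
        match self {
            Vc::Heap(v) => v.push(value),
            Vc::Inline { len, buf } if (*len as usize) < buf.len() => {
                buf[*len as usize] = Some(value);
                *len += 1;
            }
            Vc::Inline { .. } => {
                // Spill to the heap: the transition that must preserve every
                // element, the classic home of off-by-one bugs at capacity.
                let old = std::mem::replace(self, Vc::Heap(Vec::new()));
                let Vc::Inline { buf, .. } = old else { unreachable!() };
                let mut spilled: Vec<T> = buf.into_iter().flatten().collect();
                spilled.push(value);
                *self = Vc::Heap(spilled);
            }
        }
    }

    fn pop(&mut self) -> Option<T> {
        match self {
            Vc::Inline { len, buf } => {
                let last = len.checked_sub(1)?;
                *len = last;
                buf[last as usize].take()
            }
            // A real implementation might also shrink back to inline storage
            // here, which is exactly the 7↔8 oscillation the fix targets.
            Vc::Heap(v) => v.pop(),
        }
    }
}
```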
Jakub Doka a1e692eac7
maybe fixed the float op fold
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 14:06:12 +01:00
Jakub Doka 8bf2d1a266
flag the function as inline after checking style
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 14:04:30 +01:00
Jakub Doka 1571938e9f
bools can now upcast to any integer
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 13:54:52 +01:00
Jakub Doka f7d5bccdd9
fixing @itf type inference
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 13:49:20 +01:00
Jakub Doka 07d4fe416a
forgot to add defer handling to unrolled loops
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 13:27:15 +01:00
Jakub Doka b2be007ef0
adding unrolled loops, struct indexing and @len directive
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 13:20:47 +01:00
Jakub Doka bfac81c807
fixing a bug with ITF selecting based on input instead of output type
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 10:35:11 +01:00
Jakub Doka ef36e21475
making the compiler emit FMA instructions
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-16 10:16:53 +01:00
Jakub Doka 8138d6664f
properly releasing the strongrefs now
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 23:15:07 +01:00
Jakub Doka 9f43e3bb92
refactoring some stuff and loosening a requirement on assert
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 20:49:37 +01:00
Jakub Doka 6fba7da782
more general tree walking algorithm was needed (probably)
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 20:35:44 +01:00
Jakub Doka 7837eeb90d
implementing the loop iteration optimization
the multiplication and addition to a pointer is replaced with simply
incrementing the pointer itself

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 19:37:37 +01:00
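This is classic induction-variable strength reduction. Shown at the source level for illustration (the real transform happens on the compiler's IR, not on Rust code):

```rust
// Before: each iteration recomputes base + i * stride.
fn sum_before(base: *const u32, len: usize) -> u32 {
    let mut acc = 0;
    for i in 0..len {
        acc += unsafe { *base.add(i) }; // address = base + i * 4 every iteration
    }
    acc
}

// After: the multiply-and-add is replaced by bumping a running pointer.
fn sum_after(base: *const u32, len: usize) -> u32 {
    let (mut acc, mut ptr) = (0, base);
    for _ in 0..len {
        acc += unsafe { *ptr };
        ptr = unsafe { ptr.add(1) }; // pointer incremented in place
    }
    acc
}
```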
Jakub Doka 5a7a01ca02
adding the stack offset elision for return values as well
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 17:57:22 +01:00
Jakub Doka f9c47f86ad
fixing a glaring bug, where the elidded offset is also offset
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 17:42:20 +01:00
Jakub Doka 48a0c8d0b9
POC for removing needless stack offset computes when only the value is used
TBD: there are far more cases where this will apply

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-15 17:17:41 +01:00
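The idea can be sketched as a peephole over a toy instruction set: when the only use of an instruction that materialises a stack address is a load or store, the offset is folded into that memory instruction and the address computation disappears. Instruction names and the stack-pointer register number below are illustrative, not the actual hbbytecode opcodes.

```rust
const SP: u8 = 254; // illustrative register number for the stack pointer

#[derive(Clone, Debug)]
enum Inst {
    AddiSp { dst: u8, off: u64 },          // dst = stack_pointer + off
    Load { dst: u8, addr: u8, off: u64 },  // dst = mem[reg[addr] + off]
    Store { src: u8, addr: u8, off: u64 }, // mem[reg[addr] + off] = src
}

fn elide_stack_offsets(code: &mut Vec<Inst>) {
    let mut i = 0;
    while i + 1 < code.len() {
        if let (Inst::AddiSp { dst, off }, next) = (code[i].clone(), code[i + 1].clone()) {
            // Only safe when the temporary is consumed immediately and never
            // reused; a real pass would consult use counts instead of peeking
            // one instruction ahead.
            let folded = match next {
                Inst::Load { dst: d, addr, off: o } if addr == dst => {
                    Some(Inst::Load { dst: d, addr: SP, off: off + o })
                }
                Inst::Store { src, addr, off: o } if addr == dst => {
                    Some(Inst::Store { src, addr: SP, off: off + o })
                }
                _ => None,
            };
            if let Some(inst) = folded {
                code.splice(i..i + 2, [inst]); // drop the address compute
                continue;
            }
        }
        i += 1;
    }
}
```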
Jakub Doka 00f6729d31
supporting ascii literals
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-14 21:02:29 +01:00
Jakub Doka dc96c8b10a
the items accessed outside the nested scope no longer get duplicated
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-14 18:50:13 +01:00
Jakub Doka 91e35b72ee
Fixing the invalid code bricking the UI
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-14 15:33:11 +01:00
Jakub Doka 5aeeedbdce
fixing non pointer struct method receiver not counting as use
forgot to strip pointer

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-14 15:19:02 +01:00
Jakub Doka fae75072f4
removing hardcoded html files and replacing them with markdown
the markdown gets transpiled on build and built files are then included
in the server executable

Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-14 13:17:58 +01:00
Jakub Doka 71ba2c2486
Dividing function into template and instance, removing cumbersome options
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-02 15:51:12 +01:00
Jakub Doka c5d5301b7b
Removing some clones and fixing parent scoping in case of globals
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-02 12:59:07 +01:00
Jakub Doka c553c3d9e9
Removing repetitive code, adding ent slice to properly index modules
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-01 19:04:27 +01:00
Jakub Doka 9ce446b507
Adding the simplest version of unions
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-01 15:11:38 +01:00
Jakub Doka 3b4b30b2bd
Restructuring the compiler
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-12-01 14:01:44 +01:00
Jakub Doka cf672beb79
making ableos path resolver public
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 22:45:59 +01:00
Jakub Doka 3f6ebdd009
fixing phi moves (longer move cycles)
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 22:33:13 +01:00
Jakub Doka 19aca050ed
add new ableos path resolver, separate platform independent code
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 18:57:29 +01:00
Jakub Doka d368ac023b
making error fields public
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 16:49:42 +01:00
Jakub Doka 8ea6c5cfcc
fixing miscompilation of generic struct functions
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 16:30:58 +01:00
Jakub Doka e7cd2c0129
making the loader function customizable
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 15:44:51 +01:00
Jakub Doka e44d003e7f
completing the generic types example
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 13:38:44 +01:00
Jakub Doka a2ca8d98df
fixing a bug with floating point comparison
Signed-off-by: Jakub Doka <jakub.doka2@gmail.com>
2024-11-30 12:40:39 +01:00
Jakub Doka a3355a59c0
update on incomplete example 2024-11-24 18:55:30 +01:00
Jakub Doka b63b832383
wut 2024-11-24 18:51:20 +01:00
Jakub Doka 116f045a5f
adding defer 2024-11-24 18:50:55 +01:00
Jakub Doka 784d552c1d
fixing scoping bug 2024-11-24 16:43:45 +01:00
Jakub Doka 24a3aed360
implementing struct method for non generic contexts 2024-11-24 16:17:57 +01:00
Jakub Doka 58ee5c0a56
in progress of adding methods 2024-11-24 14:47:38 +01:00
Jakub Doka 9dfb2eb606
fg 2024-11-24 11:26:38 +01:00
Jakub Doka 5df4fb8882
changing the gcm to not mutate nodes in recursive functions 2024-11-23 19:47:17 +01:00
Jakub Doka 86ca959ea3
removing needless data dependencies 2024-11-23 17:23:33 +01:00
Jakub Doka f353bd5882
passing down the inference of 'void' to statements 2024-11-23 15:33:28 +01:00
Jakub Doka cad0a828d0
updating tests 2024-11-23 15:28:27 +01:00
Jakub Doka fb119bc6eb
allowing comparison of types 2024-11-23 15:28:02 +01:00
Jakub Doka aa83ed2ec9
fixing the annoyance 2024-11-23 14:19:47 +01:00
Jakub Doka fb11c94af4
fixing some bugs and extending runtime of programs 2024-11-23 12:35:16 +01:00
Jakub Doka b030b1eeb7
fixing user posts not being displayed 2024-11-23 11:14:03 +01:00
Jakub Doka 4856533b22
making sure dependencies are formatted 2024-11-23 10:31:38 +01:00
Jakub Doka d8d039b67a
adding capability to run posts and displaying cumulative runs 2024-11-23 10:18:05 +01:00
Jakub Doka b760d9ef75
improving theming 2024-11-23 08:38:41 +01:00
Jakub Doka e587de1778
fixing some lighthouse issues and bad coloring of icons 2024-11-23 00:59:58 +01:00
Jakub Doka b12579ff65
adding import count to posts 2024-11-23 00:06:25 +01:00
Jakub Doka 0aa355695a
fixing the buggg, finally 2024-11-22 21:01:45 +01:00
Jakub Doka 4a857d2317
garbage 2024-11-22 19:50:59 +01:00
Jakub Doka 2253ac6198
fixing some bugs 2024-11-22 19:50:36 +01:00
Jakub Doka 05a7bd0583
changing css for no reason 2024-11-18 13:10:44 +01:00
able ab55ec0240 Merge pull request 'point the profile button at a users fully qualified profile page' (#24) from able-patch-1 into trunk
Reviewed-on: https://git.ablecorp.us/AbleOS/holey-bytes/pulls/24
2024-11-18 06:09:19 -06:00
able 8353ab58a5 point the profile button at a users fully qualified profile page
Signed-off-by: able <abl3theabove@gmail.com>
2024-11-18 06:08:46 -06:00
Jakub Doka f527d61c1e
limiting request body size 2024-11-18 11:28:47 +01:00
Jakub Doka f3879cb013
fixing js errors, adding password change form 2024-11-18 10:31:30 +01:00
Jakub Doka e89511b14c
fixing position reporting for optimized returns 2024-11-17 22:26:31 +01:00
Jakub Doka 1c135a3050
adding interesting assert 2024-11-17 21:43:02 +01:00
Jakub Doka f83194359c
fixed yet another off by 1 2024-11-17 21:35:41 +01:00
Jakub Doka becd5c4b7f
well? 2024-11-17 21:30:20 +01:00
Jakub Doka 37dd13cab2
brah 2024-11-17 21:27:30 +01:00
Jakub Doka bc2dd82eb7
welp 2024-11-17 21:25:22 +01:00
Jakub Doka aa2de502cc
saving 2024-11-17 21:09:36 +01:00
Jakub Doka 542c69fd60
changing case checking to a warning 2024-11-17 20:57:10 +01:00
Jakub Doka 95e9270fef
adding case checking 2024-11-17 20:04:53 +01:00
Jakub Doka fe5a8631f6
fixed a bug of not marking idents as used 2024-11-17 18:44:24 +01:00
Jakub Doka 8892dd729a
fixing the false return location 2024-11-17 18:15:58 +01:00
Jakub Doka a7718e1220
cleaning up some code 2024-11-17 17:14:44 +01:00
Jakub Doka e079bbd312
forgot to support explicit enum type 2024-11-17 16:30:59 +01:00
Jakub Doka 12b9d43754
adding minimal enums 2024-11-17 16:25:39 +01:00
Jakub Doka 397b2a4b1b
fixed a stack prelude postlude being needlessly generated + struct can now be compared 2024-11-17 10:06:10 +01:00
Jakub Doka 12bb7029b4
fixed float comparison in the vm 2024-11-16 21:38:10 +01:00
Jakub Doka a64383e72b
saving 2024-11-16 20:52:38 +01:00
Igor null 2034152c83 added syntax highlighting to depell 2024-11-16 13:48:31 -06:00
Jakub Doka 13714eb513
removing stable feature supression 2024-11-16 17:41:40 +01:00
Jakub Doka 4088bd18b1
fixing compilation error for ableos 2024-11-16 17:29:30 +01:00
Jakub Doka e94b812b3b
removing the regalloc dependency 2024-11-16 14:22:34 +01:00
Jakub Doka e5d6b35f66
removing needless copies to zero register for unused values 2024-11-16 13:42:17 +01:00
Jakub Doka e6df9b6b01
cleanup 2024-11-16 11:46:59 +01:00
Jakub Doka baa70d3f12
removing needless copy into ret register 2024-11-16 10:16:35 +01:00
Jakub Doka ec4499e519
removing duplicate un instruction 2024-11-16 10:05:56 +01:00
Jakub Doka 085c593add
removing needless truncation of zeroth register 2024-11-16 10:03:08 +01:00
Jakub Doka 867a750d8f
hopefully this is less of a mess now 2024-11-15 23:18:40 +01:00
Jakub Doka b1b6d9eba1
fix the inference on itf 2024-11-15 22:53:22 +01:00
Jakub Doka 12be64965f
maybe fixed mandelbrot 2024-11-15 22:35:03 +01:00
Jakub Doka 7058efe75c
reverting phi transformation 2024-11-15 19:32:04 +01:00
Jakub Doka afc1c5aac5
organizing null checks better to get more peephole hits 2024-11-15 14:36:33 +01:00
Jakub Doka 83146cfd61
adding a simple peephole for phis 2024-11-15 13:06:03 +01:00
Jakub Doka bb625a9e19
some cleanup and ironing out bugs in new regalloc 2024-11-15 12:04:05 +01:00
Jakub Doka 81cf39b602
adding more info for assert 2024-11-14 21:57:51 +01:00
Jakub Doka e4da9cc927
adding regalloc, fixing needless semicolon insertion 2024-11-14 21:50:10 +01:00
Jakub Doka 454b0ffd1c
adding regalloc option 2024-11-14 21:34:31 +01:00
Jakub Doka 981c17ff19
fixing function destinations 2024-11-14 20:25:52 +01:00
Jakub Doka d01e31b203
fixing stack return values 2024-11-13 16:18:21 +01:00
Jakub Doka 9cb273a04b
edge case of returning stack from inlined function 2024-11-13 15:56:37 +01:00
Jakub Doka 2e2b7612d9
some cleanup 2024-11-13 15:45:45 +01:00
Jakub Doka f493c2776f
forgot to fix return 2024-11-13 15:27:35 +01:00
Jakub Doka f77bc52465
fixing unchanged parsed file 2024-11-13 15:25:27 +01:00
Jakub Doka f524013c34
making use of zero register 2024-11-13 10:28:16 +01:00
Jakub Doka 3c86eafe72
fixing another problem with rescheduling 2024-11-13 08:49:25 +01:00
Jakub Doka 0d87bf8f09
removing browser from nontest envs 2024-11-12 22:30:10 +01:00
Jakub Doka e5a4561f07
very funny fix 2024-11-12 21:54:23 +01:00
Jakub Doka b71031c146
prolly fix 2024-11-12 21:12:57 +01:00
Jakub Doka dd51961fbb
adding assert for better error 2024-11-12 21:10:42 +01:00
Jakub Doka 63f2a0dac0
well... 2024-11-12 20:59:12 +01:00
Jakub Doka 4ec88e3397
adding pointer edgecase 2024-11-12 20:42:04 +01:00
Jakub Doka f1e715e9bd
refactoring truncation 2024-11-12 19:02:29 +01:00
Jakub Doka 80fd0e89b4
fixing the inline flag delegation with generic functions 2024-11-12 17:32:20 +01:00
Jakub Doka 9949086011
allowing eca in inline functions 2024-11-12 17:11:39 +01:00
Jakub Doka c701eb7b6d
adding extra test 2024-11-12 12:54:36 +01:00
Jakub Doka f1deab11c9
making better peepholes and fixing overoptimization on memory swaps 2024-11-12 12:20:08 +01:00
Jakub Doka f079daa42d
removing redundant loop phis 2024-11-11 23:33:36 +01:00
Jakub Doka 7cac9382ad
we assumed unary operands are at least 4 bytes big 2024-11-11 23:17:13 +01:00
Jakub Doka ce2f7d2059
fixing negation truncation 2024-11-11 23:02:02 +01:00
Jakub Doka f5f9060803
adding missing instruction selection 2024-11-11 22:36:20 +01:00
Jakub Doka ad7fb5d0fc
adding errors for useless type hints 2024-11-11 22:34:42 +01:00
Jakub Doka d99672b751
fixing too strict assert 2024-11-11 22:14:54 +01:00
Jakub Doka 7def052749
preventing dangling nodes due to cycles in loop phis 2024-11-11 21:55:18 +01:00
Jakub Doka b2eefa5b83
removing assert that can cause crashes 2024-11-11 09:07:36 +01:00
Jakub Doka 3c35557872
fixing type variables in loops 2024-11-11 09:06:34 +01:00
Jakub Doka b6274f3455
fixing yet another edge case 2024-11-10 20:30:35 +01:00
Jakub Doka c61efc3933
adding inline functions 2024-11-10 19:35:48 +01:00
Jakub Doka 654005eea2
updating tests 2024-11-10 18:59:29 +01:00
Jakub Doka 335e6ec20a
fixing nasty aclass clobber priority bug 2024-11-10 18:56:33 +01:00
Jakub Doka 1e02efc1eb
improving load analysis 2024-11-10 17:32:24 +01:00
Jakub Doka 8b98c2ed1b
fixing different file imports 2024-11-10 12:26:30 +01:00
Jakub Doka c353d28be0
fixing another problem with const 2024-11-10 12:03:15 +01:00
Jakub Doka 7865d692a1
fixing the rescheduling edgecase 2024-11-10 11:04:04 +01:00
Jakub Doka 29a23cec0c
removing dbg 2024-11-10 10:30:45 +01:00
Jakub Doka 5dce4df2a1
fixing more stuff 2024-11-10 10:28:02 +01:00
Jakub Doka 42a713aeae
fixing wrong instruction selection 2024-11-10 09:17:43 +01:00
Jakub Doka 823c78bf74
preventing deduplication to cause bugs 2024-11-09 15:14:03 +01:00
Jakub Doka c657084451
integer constants can be cast to floats if the type is known to be a float 2024-11-09 14:02:13 +01:00
Jakub Doka 63a1c7feb4
fixing float conversion constant folding 2024-11-09 13:54:08 +01:00
Jakub Doka bedffa9b32
fixing constant fmt newline preservation 2024-11-09 10:58:57 +01:00
Jakub Doka b8032aa840
wrong index for extend 2024-11-09 10:28:53 +01:00
Jakub Doka 65e9f272a8
forgotten dbgs 2024-11-08 23:03:16 +01:00
Jakub Doka d2052cd2a3
adding back the exit code 2024-11-08 21:53:24 +01:00
Jakub Doka 29367d8f8b
fixing compiler pulling function destinations out of the loops 2024-11-08 20:40:18 +01:00
Jakub Doka a299bad75b
adding some simple provenance checks on return values 2024-11-08 11:51:10 +01:00
Jakub Doka 7d48d3beb1
adding constants 2024-11-08 10:57:58 +01:00
Jakub Doka 68c0248189
making type manipulation nicer 2024-11-08 10:25:34 +01:00
Jakub Doka 0ef74d89cb
putting logger back 2024-11-08 08:40:14 +01:00
Jakub Doka 1b2b9f899d
ups 2024-11-08 08:36:13 +01:00
Jakub Doka 455f70db6e
adding better error reporting; when the compiler crashes, errors are now sent through the out buffer 2024-11-08 08:36:00 +01:00
140 changed files with 12756 additions and 8086 deletions

2
.gitignore vendored

@@ -9,5 +9,7 @@ db.sqlite-journal
 # assets
 /depell/src/*.gz
 /depell/src/*.wasm
+/depell/src/static-pages/*.html
 #**/*-sv.rs
 /bytecode/src/instrs.rs
+/lang/src/testcases.rs

108
Cargo.lock generated

@ -38,17 +38,11 @@ dependencies = [
"memchr", "memchr",
] ]
[[package]]
name = "allocator-api2"
version = "0.2.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "45862d1c77f2228b9e10bc609d5bc203d86ebc9b87ad8d5d5167a6c9abf739d9"
[[package]] [[package]]
name = "anyhow" name = "anyhow"
version = "1.0.93" version = "1.0.89"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c95c10ba0b00a02636238b814946408b1322d5ac4760326e6fb8ec956d85775" checksum = "86fdf8605db99b54d3cd748a44c6d04df638eb5dafb219b135d0149bd0db01f6"
[[package]] [[package]]
name = "arc-swap" name = "arc-swap"
@ -235,7 +229,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"regex", "regex",
"rustc-hash 1.1.0", "rustc-hash",
"shlex", "shlex",
"syn", "syn",
"which", "which",
@ -265,15 +259,6 @@ dependencies = [
"generic-array", "generic-array",
] ]
[[package]]
name = "bumpalo"
version = "3.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "79296716171880943b8470b5f8d03aa55eb2e645a4874bdbb28adb49162e012c"
dependencies = [
"allocator-api2",
]
[[package]] [[package]]
name = "bytes" name = "bytes"
version = "1.8.0" version = "1.8.0"
@ -282,9 +267,9 @@ checksum = "9ac0150caa2ae65ca5bd83f25c7de183dea78d4d366469f148435e2acfbad0da"
[[package]] [[package]]
name = "cc" name = "cc"
version = "1.2.0" version = "1.1.36"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aeb932158bd710538c73702db6945cb68a8fb08c519e6e12706b94263b36db8" checksum = "baee610e9452a8f6f0a1b6194ec09ff9e2d85dea54432acdae41aa0761c95d70"
dependencies = [ dependencies = [
"jobserver", "jobserver",
"libc", "libc",
@ -348,9 +333,9 @@ dependencies = [
[[package]] [[package]]
name = "cpufeatures" name = "cpufeatures"
version = "0.2.15" version = "0.2.14"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ca741a962e1b0bff6d724a1a0958b686406e853bb14061f218562e1896f95e6" checksum = "608697df725056feaccfa42cffdaeeec3fccc4ffc38358ecd19b243e716a78e0"
dependencies = [ dependencies = [
"libc", "libc",
] ]
@ -591,12 +576,9 @@ dependencies = [
[[package]] [[package]]
name = "hashbrown" name = "hashbrown"
version = "0.15.1" version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3a9bfc1af68b1726ea47d3d5109de126281def866b33970e10fbab11b5dafab3" checksum = "1e087f84d4f86bf4b218b927129862374b72199ae7d8657835f1e89000eea4fb"
dependencies = [
"allocator-api2",
]
[[package]] [[package]]
name = "hashlink" name = "hashlink"
@ -615,11 +597,10 @@ version = "0.1.0"
name = "hblang" name = "hblang"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"hashbrown 0.15.1", "hashbrown 0.15.0",
"hbbytecode", "hbbytecode",
"hbvm", "hbvm",
"log", "log",
"regalloc2",
] ]
[[package]] [[package]]
@ -778,7 +759,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "707907fe3c25f5424cce2cb7e1cbcafee6bdbe735ca90ef77c29e84591e5b9da" checksum = "707907fe3c25f5424cce2cb7e1cbcafee6bdbe735ca90ef77c29e84591e5b9da"
dependencies = [ dependencies = [
"equivalent", "equivalent",
"hashbrown 0.15.1", "hashbrown 0.15.0",
"serde", "serde",
] ]
@ -826,9 +807,9 @@ checksum = "884e2677b40cc8c339eaefcb701c32ef1fd2493d71118dc0ca4b6a736c93bd67"
[[package]] [[package]]
name = "libc" name = "libc"
version = "0.2.162" version = "0.2.159"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "18d287de67fe55fd7e1581fe933d965a5a9477b38e949cfa9f8574ef01506398" checksum = "561d97a539a36e26a9a5fad1ea11a3039a67714694aaa379433e580854bc3dc5"
[[package]] [[package]]
name = "libloading" name = "libloading"
@ -863,6 +844,15 @@ version = "0.4.22"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a7a70ba024b9dc04c27ea2f0c0548feb474ec5c54bba33a7f72f873a39d07b24" checksum = "a7a70ba024b9dc04c27ea2f0c0548feb474ec5c54bba33a7f72f873a39d07b24"
[[package]]
name = "markdown"
version = "1.0.0-alpha.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a6491e6c702bf7e3b24e769d800746d5f2c06a6c6a2db7992612e0f429029e81"
dependencies = [
"unicode-id",
]
[[package]] [[package]]
name = "matchit" name = "matchit"
version = "0.7.3" version = "0.7.3"
@ -1023,9 +1013,9 @@ checksum = "439ee305def115ba05938db6eb1644ff94165c5ab5e9420d1c1bcedbba909391"
[[package]] [[package]]
name = "prettyplease" name = "prettyplease"
version = "0.2.25" version = "0.2.22"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "64d1ec885c64d0457d564db4ec299b2dae3f9c02808b8ad9c3a089c591b18033" checksum = "479cf940fbbb3426c32c5d5176f62ad57549a0bb84773423ba8be9d089f5faba"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"syn", "syn",
@ -1033,9 +1023,9 @@ dependencies = [
[[package]] [[package]]
name = "proc-macro2" name = "proc-macro2"
version = "1.0.89" version = "1.0.87"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f139b0662de085916d1fb67d2b4169d1addddda1919e696f3252b740b629986e" checksum = "b3e4daa0dcf6feba26f985457cdf104d4b4256fc5a09547140f3631bb076b19a"
dependencies = [ dependencies = [
"unicode-ident", "unicode-ident",
] ]
@ -1058,19 +1048,6 @@ dependencies = [
"getrandom", "getrandom",
] ]
[[package]]
name = "regalloc2"
version = "0.10.2"
source = "git+https://github.com/jakubDoka/regalloc2?branch=reuse-allocations#21c43e3ee182824e92e2b25f1d3c03ed47f9c02b"
dependencies = [
"allocator-api2",
"bumpalo",
"hashbrown 0.14.5",
"log",
"rustc-hash 2.0.0",
"smallvec",
]
[[package]] [[package]]
name = "regex" name = "regex"
version = "1.11.1" version = "1.11.1"
@ -1085,9 +1062,9 @@ dependencies = [
[[package]] [[package]]
name = "regex-automata" name = "regex-automata"
version = "0.4.9" version = "0.4.8"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "809e8dc61f6de73b46c85f4c96486310fe304c434cfa43669d7b40f711150908" checksum = "368758f23274712b504848e9d5a6f010445cc8b87a7cdb4d7cbee666c1288da3"
dependencies = [ dependencies = [
"aho-corasick", "aho-corasick",
"memchr", "memchr",
@ -1141,17 +1118,11 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08d43f7aa6b08d49f382cde6a7982047c3426db949b1424bc4b7ec9ae12c6ce2" checksum = "08d43f7aa6b08d49f382cde6a7982047c3426db949b1424bc4b7ec9ae12c6ce2"
[[package]]
name = "rustc-hash"
version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "583034fd73374156e66797ed8e5b0d5690409c9226b22d87cb7f19821c05d152"
[[package]] [[package]]
name = "rustix" name = "rustix"
version = "0.38.40" version = "0.38.37"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "99e4ea3e1cdc4b559b8e5650f9c8e5998e3e5c1343b4eaf034565f32318d63c0" checksum = "8acb788b847c24f28525660c4d7758620a7210875711f79e7f663cc152726811"
dependencies = [ dependencies = [
"bitflags", "bitflags",
"errno", "errno",
@ -1221,18 +1192,18 @@ checksum = "61697e0a1c7e512e84a621326239844a24d8207b4669b41bc18b32ea5cbf988b"
[[package]] [[package]]
name = "serde" name = "serde"
version = "1.0.215" version = "1.0.210"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f" checksum = "c8e3592472072e6e22e0a54d5904d9febf8508f65fb8552499a1abc7d1078c3a"
dependencies = [ dependencies = [
"serde_derive", "serde_derive",
] ]
[[package]] [[package]]
name = "serde_derive" name = "serde_derive"
version = "1.0.215" version = "1.0.210"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0" checksum = "243902eda00fad750862fc144cea25caca5e20d615af0a81bee94ca738f1df1f"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -1324,9 +1295,9 @@ checksum = "13c2bddecc57b384dee18652358fb23172facb8a2c51ccc10d74c157bdea3292"
[[package]] [[package]]
name = "syn" name = "syn"
version = "2.0.87" version = "2.0.79"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "25aa4ce346d03a6dcd68dd8b4010bcb74e54e62c90c573f394c46eae99aba32d" checksum = "89132cd0bf050864e1d38dc3bbc07a0eb8e7530af26344d3d2bbbef83499f590"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -1484,6 +1455,12 @@ version = "1.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825" checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
[[package]]
name = "unicode-id"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "10103c57044730945224467c09f71a4db0071c123a0648cc3e818913bde6b561"
[[package]] [[package]]
name = "unicode-ident" name = "unicode-ident"
version = "1.0.13" version = "1.0.13"
@ -1686,6 +1663,7 @@ name = "xtask"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"markdown",
"walrus", "walrus",
] ]

View file

@@ -17,9 +17,7 @@ members = [
 [workspace.dependencies]
 hbbytecode = { path = "bytecode", default-features = false }
 hbvm = { path = "vm", default-features = false }
-hbxrt = { path = "xrt" }
 hblang = { path = "lang", default-features = false }
-hbjit = { path = "jit" }
 [profile.release]
 lto = true

View file

@@ -5,6 +5,6 @@ edition = "2018"
 [features]
 default = ["disasm"]
-std = []
-disasm = ["std"]
+disasm = ["alloc"]
+alloc = []

View file

@@ -98,6 +98,27 @@ fn gen_instrs(generated: &mut String) -> Result<(), Box<dyn std::error::Error>>
 writeln!(generated, " {name} = {id},")?;
 }
 writeln!(generated, "}}")?;
+writeln!(generated, "impl {instr} {{")?;
+writeln!(generated, " pub fn size(self) -> usize {{")?;
+writeln!(generated, " match self {{")?;
+let mut instrs = instructions().collect::<Vec<_>>();
+instrs.sort_unstable_by_key(|&[.., ty, _]| iter_args(ty).map(arg_to_width).sum::<usize>());
+for group in instrs.chunk_by(|[.., a, _], [.., b, _]| {
+iter_args(a).map(arg_to_width).sum::<usize>()
+== iter_args(b).map(arg_to_width).sum::<usize>()
+}) {
+let ty = group[0][2];
+for &[_, name, ..] in group {
+writeln!(generated, " | {instr}::{name}")?;
+}
+generated.pop();
+let size = iter_args(ty).map(arg_to_width).sum::<usize>() + 1;
+writeln!(generated, " => {size},")?;
+}
+writeln!(generated, " }}")?;
+writeln!(generated, " }}")?;
+writeln!(generated, "}}")?;
 }
 '_arg_kind: {

View file

@@ -254,8 +254,7 @@ pub fn disasm<'a>(
 || global_offset > off + len
 || prev
 .get(global_offset as usize)
-.map_or(true, |&b| instr_from_byte(b).is_err())
-|| prev[global_offset as usize] == 0;
+.is_none_or(|&b| instr_from_byte(b).is_err());
 has_oob |= local_has_oob;
 let label = labels.get(&global_offset).unwrap();
 if local_has_oob {

View file

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="24px" viewBox="0 -960 960 960" width="24px" fill="#e8eaed"><path d="M480-320 280-520l56-58 104 104v-326h80v326l104-104 56 58-200 200ZM240-160q-33 0-56.5-23.5T160-240v-120h80v120h480v-120h80v120q0 33-23.5 56.5T720-160H240Z"/></svg>


1
depell/src/icons/run.svg Normal file

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="24px" viewBox="0 -960 960 960" width="24px" fill="#e8eaed"><path d="M320-200v-560l440 280-440 280Zm80-280Zm0 134 210-134-210-134v268Z"/></svg>


View file

@ -1,16 +1,15 @@
* { * {
font-family: var(--font); font-family: var(--font);
line-height: 1.3;
} }
body { body {
--primary: white; --primary: light-dark(white, #181A1B);
--secondary: #EFEFEF; --secondary: light-dark(#EFEFEF, #212425);
--timestamp: #777777; --timestamp: light-dark(#555555, #AAAAAA);
--error: #ff3333; --error: #ff3333;
--placeholder: #333333;
} }
body { body {
--small-gap: 5px; --small-gap: 5px;
--font: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif; --font: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif;
@ -36,6 +35,11 @@ body {
} }
div.preview { div.preview {
margin: var(--small-gap) 0px;
display: flex;
flex-direction: column;
gap: var(--small-gap);
div.info { div.info {
display: flex; display: flex;
gap: var(--small-gap); gap: var(--small-gap);
@ -45,12 +49,34 @@ div.preview {
} }
} }
div.stats { div.stat {
display: flex; display: flex;
gap: var(--small-gap);
svg {
height: 18px;
} }
} }
div.code {
position: relative;
nav {
position: absolute;
right: 0;
padding: var(--small-gap);
button {
display: flex;
padding: 0;
}
}
}
}
svg {
fill: black;
}
form { form {
display: flex; display: flex;
flex-direction: column; flex-direction: column;
@ -83,6 +109,8 @@ pre {
font-family: var(--monospace); font-family: var(--monospace);
tab-size: 4; tab-size: 4;
overflow-x: auto; overflow-x: auto;
white-space: pre-wrap;
word-wrap: break-word;
} }
input { input {
@ -108,6 +136,11 @@ button:hover:not(:active) {
background: var(--primary); background: var(--primary);
} }
code {
font-family: var(--monospace);
line-height: 1;
}
div#code-editor { div#code-editor {
display: flex; display: flex;
position: relative; position: relative;
@ -141,3 +174,40 @@ div#dep-list {
} }
} }
} }
.syn {
font-family: var(--monospace);
&.Comment {
color: #939f91;
}
&.Keyword {
color: #f85552;
}
&.Identifier,
&.Directive {
color: #3a94c5;
}
/* &.Number {} */
&.String {
color: #8da101;
}
&.Op,
&.Assign {
color: #f57d26;
}
&.Paren,
&.Bracket,
&.Comma,
&.Dot,
&.Ctor,
&.Colon {
color: light-dark(#5c6a72, #999999);
}
}

View file

@ -14,7 +14,7 @@ const stack_pointer_offset = 1 << 20;
/** @param {WebAssembly.Instance} instance @param {Post[]} packages @param {number} fuel /** @param {WebAssembly.Instance} instance @param {Post[]} packages @param {number} fuel
* @returns {string} */ * @returns {string} */
function compileCode(instance, packages, fuel) { function compileCode(instance, packages, fuel = 100) {
let { let {
INPUT, INPUT_LEN, INPUT, INPUT_LEN,
LOG_MESSAGES, LOG_MESSAGES_LEN, LOG_MESSAGES, LOG_MESSAGES_LEN,
@ -34,7 +34,7 @@ function compileCode(instance, packages, fuel) {
new DataView(memory.buffer).setUint32(INPUT_LEN.value, codeLength, true); new DataView(memory.buffer).setUint32(INPUT_LEN.value, codeLength, true);
runWasmFunction(instance, compile_and_run, fuel); runWasmFunction(instance, compile_and_run, fuel);
return bufToString(memory, LOG_MESSAGES, LOG_MESSAGES_LEN); return bufToString(memory, LOG_MESSAGES, LOG_MESSAGES_LEN).trim();
} }
/**@type{WebAssembly.Instance}*/ let fmtInstance; /**@type{WebAssembly.Instance}*/ let fmtInstance;
@ -44,23 +44,25 @@ async function getFmtInstance() {
return fmtInstance ??= (await fmtInstaceFuture).instance; return fmtInstance ??= (await fmtInstaceFuture).instance;
} }
/** @param {WebAssembly.Instance} instance @param {string} code @param {"fmt" | "minify"} action /** @param {WebAssembly.Instance} instance @param {string} code @param {"tok" | "fmt" | "minify"} action
* @returns {string | undefined} */ * @returns {string | Uint8Array | undefined} */
function modifyCode(instance, code, action) { function modifyCode(instance, code, action) {
let { let {
INPUT, INPUT_LEN, INPUT, INPUT_LEN,
OUTPUT, OUTPUT_LEN, OUTPUT, OUTPUT_LEN,
memory, fmt, minify memory, fmt, tok, minify
} = instance.exports; } = instance.exports;
let funs = { fmt, tok, minify };
let fun = funs[action];
if (!(true if (!(true
&& memory instanceof WebAssembly.Memory && memory instanceof WebAssembly.Memory
&& INPUT instanceof WebAssembly.Global && INPUT instanceof WebAssembly.Global
&& INPUT_LEN instanceof WebAssembly.Global && INPUT_LEN instanceof WebAssembly.Global
&& OUTPUT instanceof WebAssembly.Global && OUTPUT instanceof WebAssembly.Global
&& OUTPUT_LEN instanceof WebAssembly.Global && OUTPUT_LEN instanceof WebAssembly.Global
&& typeof fmt === "function" && funs.hasOwnProperty(action)
&& typeof minify === "function" && typeof fun === "function"
)) never(); )) never();
if (action !== "fmt") { if (action !== "fmt") {
@ -72,8 +74,14 @@ function modifyCode(instance, code, action) {
dw.setUint32(INPUT_LEN.value, code.length, true); dw.setUint32(INPUT_LEN.value, code.length, true);
new Uint8Array(memory.buffer, INPUT.value).set(new TextEncoder().encode(code)); new Uint8Array(memory.buffer, INPUT.value).set(new TextEncoder().encode(code));
return runWasmFunction(instance, action === "fmt" ? fmt : minify) ? if (!runWasmFunction(instance, fun)) {
bufToString(memory, OUTPUT, OUTPUT_LEN) : undefined; return undefined;
}
if (action === "tok") {
return bufSlice(memory, OUTPUT, OUTPUT_LEN);
} else {
return bufToString(memory, OUTPUT, OUTPUT_LEN);
}
} }
@ -119,6 +127,15 @@ function packPosts(posts, view) {
return len; return len;
} }
/** @param {WebAssembly.Memory} mem
* @param {WebAssembly.Global} ptr
* @param {WebAssembly.Global} len
* @return {Uint8Array} */
function bufSlice(mem, ptr, len) {
return new Uint8Array(mem.buffer, ptr.value,
new DataView(mem.buffer).getUint32(len.value, true));
}
/** @param {WebAssembly.Memory} mem /** @param {WebAssembly.Memory} mem
* @param {WebAssembly.Global} ptr * @param {WebAssembly.Global} ptr
* @param {WebAssembly.Global} len * @param {WebAssembly.Global} len
@ -141,12 +158,13 @@ function wireUp(target) {
const importRe = /@use\s*\(\s*"(([^"]|\\")+)"\s*\)/g; const importRe = /@use\s*\(\s*"(([^"]|\\")+)"\s*\)/g;
/** @param {string} code /** @param {WebAssembly.Instance} fmt
* @param {string} code
* @param {string[]} roots * @param {string[]} roots
* @param {Post[]} buf * @param {Post[]} buf
* @param {Set<string>} prevRoots * @param {Set<string>} prevRoots
* @returns {void} */ * @returns {void} */
function loadCachedPackages(code, roots, buf, prevRoots) { function loadCachedPackages(fmt, code, roots, buf, prevRoots) {
buf[0].code = code; buf[0].code = code;
roots.length = 0; roots.length = 0;
@ -162,7 +180,10 @@ function loadCachedPackages(code, roots, buf, prevRoots) {
for (let imp = roots.pop(); imp !== undefined; imp = roots.pop()) { for (let imp = roots.pop(); imp !== undefined; imp = roots.pop()) {
if (prevRoots.has(imp)) continue; prevRoots.add(imp); if (prevRoots.has(imp)) continue; prevRoots.add(imp);
buf.push({ path: imp, code: localStorage.getItem("package-" + imp) ?? never() });
const fmtd = modifyCode(fmt, localStorage.getItem("package-" + imp) ?? never(), "fmt");
if (typeof fmtd != "string") never();
buf.push({ path: imp, code: fmtd });
for (const match of buf[buf.length - 1].code.matchAll(importRe)) { for (const match of buf[buf.length - 1].code.matchAll(importRe)) {
roots.push(match[1]); roots.push(match[1]);
} }
@ -170,6 +191,61 @@ function loadCachedPackages(code, roots, buf, prevRoots) {
} }
/**@type{Set<string>}*/ const prevRoots = new Set(); /**@type{Set<string>}*/ const prevRoots = new Set();
/**@typedef {Object} PackageCtx
* @property {AbortController} [cancelation]
* @property {string[]} keyBuf
* @property {Set<string>} prevParams
* @property {HTMLTextAreaElement} [edit] */
/** @param {string} source @param {Set<string>} importDiff @param {HTMLPreElement} errors @param {PackageCtx} ctx */
async function fetchPackages(source, importDiff, errors, ctx) {
importDiff.clear();
for (const match of source.matchAll(importRe)) {
if (localStorage["package-" + match[1]]) continue;
importDiff.add(match[1]);
}
if (importDiff.size !== 0 && (ctx.prevParams.size != importDiff.size
|| [...ctx.prevParams.keys()].every(e => importDiff.has(e)))) {
if (ctx.cancelation) ctx.cancelation.abort();
ctx.prevParams.clear();
ctx.prevParams = new Set([...importDiff]);
ctx.cancelation = new AbortController();
ctx.keyBuf.length = 0;
ctx.keyBuf.push(...importDiff.keys());
errors.textContent = "fetching: " + ctx.keyBuf.join(", ");
await fetch(`/code`, {
method: "POST",
signal: ctx.cancelation.signal,
headers: { "Content-Type": "application/json" },
body: JSON.stringify(ctx.keyBuf),
}).then(async e => {
try {
const json = await e.json();
if (e.status == 200) {
for (const [key, value] of Object.entries(json)) {
localStorage["package-" + key] = value;
}
const missing = ctx.keyBuf.filter(i => json[i] === undefined);
if (missing.length !== 0) {
errors.textContent = "deps not found: " + missing.join(", ");
} else {
ctx.cancelation = undefined;
ctx.edit?.dispatchEvent(new InputEvent("input"));
}
}
} catch (er) {
errors.textContent = "completely failed to fetch ("
+ e.status + "): " + ctx.keyBuf.join(", ");
console.error(e, er);
}
});
}
}
/** @param {HTMLElement} target */ /** @param {HTMLElement} target */
async function bindCodeEdit(target) { async function bindCodeEdit(target) {
@ -188,72 +264,29 @@ async function bindCodeEdit(target) {
const hbc = await getHbcInstance(), fmt = await getFmtInstance(); const hbc = await getHbcInstance(), fmt = await getFmtInstance();
let importDiff = new Set(); let importDiff = new Set();
const keyBuf = [];
/**@type{Post[]}*/ /**@type{Post[]}*/
const packages = [{ path: "local.hb", code: "" }]; const packages = [{ path: "local.hb", code: "" }];
const debounce = 100; const debounce = 100;
/**@type{AbortController|undefined}*/
let cancelation = undefined;
let timeout = 0; let timeout = 0;
const ctx = { keyBuf: [], prevParams: new Set(), edit };
prevRoots.clear(); prevRoots.clear();
const onInput = () => { const onInput = () => {
importDiff.clear(); fetchPackages(edit.value, importDiff, errors, ctx);
for (const match of edit.value.matchAll(importRe)) {
if (localStorage["package-" + match[1]]) continue;
importDiff.add(match[1]);
}
if (importDiff.size !== 0) { if (ctx.cancelation && importDiff.size !== 0) {
if (cancelation) cancelation.abort();
cancelation = new AbortController();
keyBuf.length = 0;
keyBuf.push(...importDiff.keys());
errors.textContent = "fetching: " + keyBuf.join(", ");
fetch(`/code`, {
method: "POST",
signal: cancelation.signal,
headers: { "Content-Type": "application/json" },
body: JSON.stringify(keyBuf),
}).then(async e => {
try {
const json = await e.json();
if (e.status == 200) {
for (const [key, value] of Object.entries(json)) {
localStorage["package-" + key] = value;
}
const missing = keyBuf.filter(i => json[i] === undefined);
if (missing.length !== 0) {
errors.textContent = "deps not found: " + missing.join(", ");
} else {
cancelation = undefined;
edit.dispatchEvent(new InputEvent("input"));
}
}
} catch (er) {
errors.textContent = "completely failed to fetch ("
+ e.status + "): " + keyBuf.join(", ");
console.error(e, er);
}
});
}
if (cancelation && importDiff.size !== 0) {
return; return;
} }
loadCachedPackages(edit.value, keyBuf, packages, prevRoots); loadCachedPackages(fmt, edit.value, ctx.keyBuf, packages, prevRoots);
errors.textContent = compileCode(hbc, packages, 1); errors.textContent = compileCode(hbc, packages);
const minified_size = modifyCode(fmt, edit.value, "minify")?.length; const minified_size = modifyCode(fmt, edit.value, "minify")?.length;
if (minified_size) { if (minified_size) {
codeSize.textContent = (MAX_CODE_SIZE - minified_size) + ""; codeSize.textContent = (MAX_CODE_SIZE - minified_size) + "";
const perc = Math.min(100, Math.floor(100 * (minified_size / MAX_CODE_SIZE))); const perc = Math.min(100, Math.floor(100 * (minified_size / MAX_CODE_SIZE)));
codeSize.style.color = `color-mix(in srgb, white, var(--error) ${perc}%)`; codeSize.style.color = `color-mix(in srgb, light-dark(black, white), var(--error) ${perc}%)`;
} }
timeout = 0; timeout = 0;
}; };
@ -265,19 +298,86 @@ async function bindCodeEdit(target) {
edit.dispatchEvent(new InputEvent("input")); edit.dispatchEvent(new InputEvent("input"));
} }
/** @type {{ [key: string]: (content: string) => Promise<string> | string }} */ /**
* @type {Array<string>}
* to be synched with `enum TokenGroup` in bytecode/src/fmt.rs */
const TOK_CLASSES = [
'Blank',
'Comment',
'Keyword',
'Identifier',
'Directive',
'Number',
'String',
'Op',
'Assign',
'Paren',
'Bracket',
'Colon',
'Comma',
'Dot',
'Ctor',
];
/** @type {{ [key: string]: (el: HTMLElement) => void | Promise<void> }} */
const applyFns = { const applyFns = {
timestamp: (content) => new Date(parseInt(content) * 1000).toLocaleString(), timestamp: (el) => {
fmt: (content) => getFmtInstance().then(i => modifyCode(i, content, "fmt") ?? "invalid code"), const timestamp = el.innerText;
const date = new Date(parseInt(timestamp) * 1000);
el.innerText = date.toLocaleString();
},
fmt,
}; };
/**
* @param {HTMLElement} target */
async function fmt(target) {
const code = target.innerText;
const instance = await getFmtInstance();
const decoder = new TextDecoder('utf-8');
const fmt = modifyCode(instance, code, 'fmt');
if (typeof fmt !== "string") return;
const codeBytes = new TextEncoder().encode(fmt);
const tok = modifyCode(instance, fmt, 'tok');
if (!(tok instanceof Uint8Array)) return;
target.innerHTML = '';
let start = 0;
let kind = tok[0];
for (let ii = 1; ii <= tok.length; ii += 1) {
// split over same tokens and buffer end
if (tok[ii] === kind && ii < tok.length) {
continue;
}
const text = decoder.decode(codeBytes.subarray(start, ii));
const textNode = document.createTextNode(text);;
if (kind === 0) {
target.appendChild(textNode);
} else {
const el = document.createElement('span');
el.classList.add('syn');
el.classList.add(TOK_CLASSES[kind]);
el.appendChild(textNode);
target.appendChild(el);
}
if (ii == tok.length) {
break;
}
start = ii;
kind = tok[ii];
}
}
/** @param {HTMLElement} target */ /** @param {HTMLElement} target */
function execApply(target) { function execApply(target) {
const proises = [];
for (const elem of target.querySelectorAll('[apply]')) { for (const elem of target.querySelectorAll('[apply]')) {
if (!(elem instanceof HTMLElement)) continue; if (!(elem instanceof HTMLElement)) continue;
const funcname = elem.getAttribute('apply') ?? never(); const funcname = elem.getAttribute('apply') ?? never();
let res = applyFns[funcname](elem.textContent ?? ""); const vl = applyFns[funcname](elem);
if (res instanceof Promise) res.then(c => elem.textContent = c); if (vl instanceof Promise) proises.push(vl);
else elem.textContent = res; }
if (target === document.body) {
Promise.all(proises).then(() => document.body.hidden = false);
} }
} }
@ -332,10 +432,13 @@ function cacheInputs(target) {
} }
/** @param {string} [path] */ /** @param {string} [path] */
function updaetTab(path) { function updateTab(path) {
console.log(path);
for (const elem of document.querySelectorAll("button[hx-push-url]")) { for (const elem of document.querySelectorAll("button[hx-push-url]")) {
if (elem instanceof HTMLButtonElement) if (elem instanceof HTMLButtonElement)
elem.disabled = elem.getAttribute("hx-push-url") === (path ?? window.location.pathname); elem.disabled =
elem.getAttribute("hx-push-url") === path
|| elem.getAttribute("hx-push-url") === window.location.pathname;
} }
} }
@ -351,6 +454,7 @@ if (window.location.hostname === 'localhost') {
const code = "main:=fn():void{return}"; const code = "main:=fn():void{return}";
const inst = await getFmtInstance() const inst = await getFmtInstance()
const fmtd = modifyCode(inst, code, "fmt") ?? never(); const fmtd = modifyCode(inst, code, "fmt") ?? never();
if (typeof fmtd !== "string") never();
const prev = modifyCode(inst, fmtd, "minify") ?? never(); const prev = modifyCode(inst, fmtd, "minify") ?? never();
if (code != prev) console.error(code, prev); if (code != prev) console.error(code, prev);
} }
@ -360,7 +464,7 @@ if (window.location.hostname === 'localhost') {
code: "main:=fn():int{return 42}", code: "main:=fn():int{return 42}",
}]; }];
const res = compileCode(await getHbcInstance(), posts, 1) ?? never(); const res = compileCode(await getHbcInstance(), posts, 1) ?? never();
const expected = "exit code: 42\n"; const expected = "exit code: 42";
if (expected != res) console.error(expected, res); if (expected != res) console.error(expected, res);
} }
})() })()
@ -370,8 +474,7 @@ document.body.addEventListener('htmx:afterSwap', (ev) => {
if (!(ev.target instanceof HTMLElement)) never(); if (!(ev.target instanceof HTMLElement)) never();
wireUp(ev.target); wireUp(ev.target);
if (ev.target.tagName == "MAIN" || ev.target.tagName == "BODY") if (ev.target.tagName == "MAIN" || ev.target.tagName == "BODY")
updaetTab(ev['detail'].pathInfo.finalRequestPath); updateTab(ev['detail'].pathInfo.finalRequestPath);
console.log(ev);
}); });
getFmtInstance().then(inst => { getFmtInstance().then(inst => {
@ -422,6 +525,30 @@ getFmtInstance().then(inst => {
Object.assign(window, { filterCodeDeps }); Object.assign(window, { filterCodeDeps });
}); });
updaetTab(); /** @param {HTMLElement} target */
function runPost(target) {
while (!target.matches("div[class=preview]")) target = target.parentElement ?? never();
const code = target.querySelector("pre[apply=fmt]");
if (!(code instanceof HTMLPreElement)) never();
const output = target.querySelector("pre[id=compiler-output]");
if (!(output instanceof HTMLPreElement)) never();
Promise.all([getHbcInstance(), getFmtInstance()]).then(async ([hbc, fmt]) => {
const ctx = { keyBuf: [], prevParams: new Set() };
await fetchPackages(code.innerText ?? never(), new Set(), output, ctx);
const posts = [{ path: "this", code: "" }];
loadCachedPackages(fmt, code.innerText ?? never(), ctx.keyBuf, posts, new Set());
output.textContent = compileCode(hbc, posts);
output.hidden = false;
});
let author = encodeURIComponent(target.dataset.author ?? never());
let name = encodeURIComponent(target.dataset.name ?? never());
fetch(`/post/run?author=${author}&name=${name}`, { method: "POST" })
}
Object.assign(window, { runPost });
updateTab();
wireUp(document.body); wireUp(document.body);

View file

@ -1,10 +1,10 @@
#![feature(iter_collect_into)] #![feature(iter_collect_into, macro_metavar_expr)]
use { use {
argon2::{password_hash::SaltString, PasswordVerifier}, argon2::{password_hash::SaltString, PasswordVerifier},
axum::{ axum::{
body::Bytes, body::Bytes,
extract::Path, extract::{DefaultBodyLimit, Path},
http::{header::COOKIE, request::Parts}, http::{header::COOKIE, request::Parts, StatusCode},
response::{AppendHeaders, Html}, response::{AppendHeaders, Html},
}, },
const_format::formatcp, const_format::formatcp,
@ -52,22 +52,27 @@ async fn amain() {
db::init(); db::init();
let router = axum::Router::new() let router = axum::Router::new()
.route("/", get(Index::page))
.route("/index.css", static_asset!("text/css", "index.css")) .route("/index.css", static_asset!("text/css", "index.css"))
.route("/index.js", static_asset!("text/javascript", "index.js")) .route("/index.js", static_asset!("text/javascript", "index.js"))
.route("/hbfmt.wasm", static_asset!("application/wasm", "hbfmt.wasm")) .route("/hbfmt.wasm", static_asset!("application/wasm", "hbfmt.wasm"))
.route("/hbc.wasm", static_asset!("application/wasm", "hbc.wasm")) .route("/hbc.wasm", static_asset!("application/wasm", "hbc.wasm"))
.route("/index-view", get(Index::get)) .route("/", get(Index::page))
.route("/index-view", get(Index::get_with_blog))
.route("/blogs/index-view", get(Index::get))
.route("/blogs/developing-hblang", get(DevelopingHblang::page))
.route("/blogs/developing-hblang-view", get(DevelopingHblang::get))
.route("/feed", get(Feed::page)) .route("/feed", get(Feed::page))
.route("/feed-view", get(Feed::get)) .route("/feed-view", get(Feed::get))
.route("/feed-more", post(Feed::more)) .route("/feed-more", post(Feed::more))
.route("/profile", get(Profile::page)) .route("/profile", get(Profile::page))
.route("/profile-view", get(Profile::get)) .route("/profile-view", get(Profile::get))
.route("/profile/:name", get(Profile::get_other_page)) .route("/profile/:name", get(Profile::get_other_page))
.route("/profile/password", post(PasswordChange::post))
.route("/profile-view/:name", get(Profile::get_other)) .route("/profile-view/:name", get(Profile::get_other))
.route("/post", get(Post::page)) .route("/post", get(Post::page))
.route("/post-view", get(Post::get)) .route("/post-view", get(Post::get))
.route("/post", post(Post::post)) .route("/post", post(Post::post))
.route("/post/run", post(Post::run))
.route("/code", post(fetch_code)) .route("/code", post(fetch_code))
.route("/login", get(Login::page)) .route("/login", get(Login::page))
.route("/login-view", get(Login::get)) .route("/login-view", get(Login::get))
@ -85,7 +90,8 @@ async fn amain() {
.as_millis(); .as_millis();
move || async move { id.to_string() } move || async move { id.to_string() }
}), }),
); )
.layer(DefaultBodyLimit::max(16 * 1024));
#[cfg(feature = "tls")] #[cfg(feature = "tls")]
{ {
@ -196,12 +202,45 @@ impl Page for Feed {
} }
} }
#[derive(Default)] macro_rules! decl_static_pages {
struct Index; ($(
#[derive(PublicPage)]
#[page(static = $file:literal)]
struct $name:ident;
)*) => {
const ALL_STATIC_PAGES: [&str; ${count($file)}] = [$($file),*];
impl PublicPage for Index { $(
#[derive(Default)]
struct $name;
impl PublicPage for $name {
fn render_to_buf(self, buf: &mut String) { fn render_to_buf(self, buf: &mut String) {
buf.push_str(include_str!("welcome-page.html")); buf.push_str(include_str!(concat!("static-pages/", $file, ".html")));
}
async fn page(session: Option<Session>) -> Html<String> {
base(|s| blog_base(s, |s| Self::default().render_to_buf(s)), session.as_ref())
}
}
)*
};
}
decl_static_pages! {
#[derive(PublicPage)]
#[page(static = "welcome")]
struct Index;
#[derive(PublicPage)]
#[page(static = "developing-hblang")]
struct DevelopingHblang;
}
impl Index {
async fn get_with_blog() -> Html<String> {
let mut buf = String::new();
blog_base(&mut buf, |s| Index.render_to_buf(s));
Html(buf)
} }
} }
@ -239,11 +278,27 @@ impl Page for Post {
<input type="submit" value="submit"> <input type="submit" value="submit">
<pre id="compiler-output"></pre> <pre id="compiler-output"></pre>
</form> </form>
!{include_str!("post-page.html")}
<div id="dep-list">
<input placeholder="search impoted deps.." oninput="filterCodeDeps(this, event)">
<section id="deps">
"results show here..."
</section>
</div>
<div>
!{include_str!("static-pages/post.html")}
</div>
} }
} }
} }
#[derive(Deserialize)]
struct Run {
author: String,
name: String,
}
impl Post { impl Post {
pub fn from_row(r: &rusqlite::Row) -> rusqlite::Result<Self> { pub fn from_row(r: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Post { Ok(Post {
@ -251,10 +306,25 @@ impl Post {
name: r.get(1)?, name: r.get(1)?,
timestamp: r.get(2)?, timestamp: r.get(2)?,
code: r.get(3)?, code: r.get(3)?,
imports: r.get(4)?,
runs: r.get(5)?,
..Default::default() ..Default::default()
}) })
} }
async fn run(
session: Session,
axum::extract::Query(run): axum::extract::Query<Run>,
) -> StatusCode {
match db::with(|qes| qes.creata_run.insert((run.name, run.author, session.name))) {
Ok(_) => StatusCode::OK,
Err(e) => {
log::error!("creating run record failed: {e}");
StatusCode::INTERNAL_SERVER_ERROR
}
}
}
async fn post( async fn post(
session: Session, session: Session,
axum::Form(mut data): axum::Form<Self>, axum::Form(mut data): axum::Form<Self>,
@ -310,7 +380,7 @@ impl Post {
impl fmt::Display for Post { impl fmt::Display for Post {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let Self { author, name, timestamp, imports, runs, dependencies, code, .. } = self; let Self { author, name, timestamp, imports, runs, dependencies, code, .. } = self;
write_html! { f <div class="preview"> write_html! { f <div class="preview" "data-author"=author "data-name"=name>
<div class="info"> <div class="info">
<span> <span>
<a "hx-get"={format_args!("/profile-view/{author}")} href="" "hx-target"="main" <a "hx-get"={format_args!("/profile-view/{author}")} href="" "hx-target"="main"
@ -320,16 +390,21 @@ impl fmt::Display for Post {
name name
</span> </span>
<span apply="timestamp">timestamp</span> <span apply="timestamp">timestamp</span>
</div> for (name, count) in [include_str!("icons/download.svg"), include_str!("icons/run.svg"), "deps"]
<div class="stats"> .iter()
for (name, count) in "inps runs deps".split(' ')
.zip([imports, runs, dependencies]) .zip([imports, runs, dependencies])
.filter(|(_, &c)| c != 0) .filter(|(_, &c)| c != 0)
{ {
name ": "<span>count</span> <div class="stat">!name count</div>
} }
</div> </div>
<div class="code">
<nav>
<button onmousedown="runPost(this)">!{include_str!("icons/run.svg")}</button>
</nav>
<pre apply="fmt">code</pre> <pre apply="fmt">code</pre>
</div>
<pre hidden id="compiler-output"></pre>
if *timestamp == 0 { if *timestamp == 0 {
<button "hx-get"="/post" "hx-swap"="outerHTML" <button "hx-get"="/post" "hx-swap"="outerHTML"
"hx-target"="[preview]">"edit"</button> "hx-target"="[preview]">"edit"</button>
@ -339,6 +414,70 @@ impl fmt::Display for Post {
} }
} }
#[derive(Deserialize, Default)]
struct PasswordChange {
old_password: String,
new_password: String,
#[serde(skip)]
error: Option<&'static str>,
}
impl PasswordChange {
async fn post(
session: Session,
axum::Form(mut change): axum::Form<PasswordChange>,
) -> Html<String> {
db::with(|que| {
match que.authenticate.query_row((&session.name,), |r| r.get::<_, String>(1)) {
Ok(hash) if verify_password(&hash, &change.old_password).is_err() => {
change.error = Some("invalid credentials");
}
Ok(_) => {
let new_hashed = hash_password(&change.new_password);
match que
.change_passowrd
.execute((new_hashed, &session.name))
.log("execute update")
{
None => change.error = Some("intenal server error"),
Some(0) => change.error = Some("password is incorrect"),
Some(_) => {}
}
}
Err(rusqlite::Error::QueryReturnedNoRows) => {
change.error = Some("invalid credentials");
}
Err(e) => {
log::error!("login queri failed: {e}");
change.error = Some("internal server error");
}
}
});
if change.error.is_some() {
change.render(&session)
} else {
PasswordChange::default().render(&session)
}
}
}
impl Page for PasswordChange {
fn render_to_buf(self, _: &Session, buf: &mut String) {
let Self { old_password, new_password, error } = self;
write_html! { (buf)
<form "hx-post"="/profile/password" "hx-swap"="outerHTML">
if let Some(e) = error { <div class="error">e</div> }
<input name="old_password" type="password" autocomplete="old-password"
placeholder="old password" value=old_password>
<input name="new_password" type="password" autocomplete="new-password" placeholder="new password"
value=new_password>
<input type="submit" value="submit">
</form>
}
}
}
#[derive(Default)] #[derive(Default)]
struct Profile { struct Profile {
other: Option<String>, other: Option<String>,
@ -357,20 +496,24 @@ impl Profile {
impl Page for Profile { impl Page for Profile {
fn render_to_buf(self, session: &Session, buf: &mut String) { fn render_to_buf(self, session: &Session, buf: &mut String) {
db::with(|db| { db::with(|db| {
let name = self.other.as_ref().unwrap_or(&session.name);
let iter = db let iter = db
.get_user_posts .get_user_posts
.query_map((self.other.as_ref().unwrap_or(&session.name),), Post::from_row) .query_map((name,), Post::from_row)
.log("get user posts query") .log("get user posts query")
.into_iter() .into_iter()
.flatten() .flatten()
.filter_map(|p| p.log("user post row")); .filter_map(|p| p.log("user post row"));
write_html! { (buf) write_html! { (*buf)
if name == &session.name {
|b|{PasswordChange::default().render_to_buf(session, b)}
}
for post in iter { for post in iter {
!{post} !{post}
} else { } else {
"no posts" "no posts"
} }
!{include_str!("profile-page.html")}
} }
}) })
} }
@ -399,7 +542,7 @@ struct Login {
impl PublicPage for Login { impl PublicPage for Login {
fn render_to_buf(self, buf: &mut String) { fn render_to_buf(self, buf: &mut String) {
let Login { name, password, error } = self; let Self { name, password, error } = self;
write_html! { (buf) write_html! { (buf)
<form "hx-post"="/login" "hx-swap"="outerHTML"> <form "hx-post"="/login" "hx-swap"="outerHTML">
if let Some(e) = error { <div class="error">e</div> } if let Some(e) = error { <div class="error">e</div> }
@ -439,13 +582,12 @@ impl Login {
data.error = Some("invalid credentials"); data.error = Some("invalid credentials");
} }
Err(e) => { Err(e) => {
log::error!("foo {e}"); log::error!("login queri failed: {e}");
data.error = Some("internal server error"); data.error = Some("internal server error");
} }
}); });
if data.error.is_some() { if data.error.is_some() {
log::error!("what {:?}", data);
Err(data.render()) Err(data.render())
} else { } else {
Ok(AppendHeaders([ Ok(AppendHeaders([
@ -534,6 +676,29 @@ impl Signup {
} }
} }
fn blog_base(s: &mut String, body: impl FnOnce(&mut String)) {
let nav_button = |f: &mut String, name: &str| {
write_html! {(f)
<button "hx-push-url"={format_args!("/blogs/{name}")}
"hx-get"={format_args!("/blogs/{name}-view")}
"hx-target"="main#blog"
"hx-swap"="innerHTML">name</button>
}
};
write_html! {(*s)
<nav><section>
<button "hx-push-url"="/" "hx-get"="/blogs/index-view"
"hx-target"="main#blog" "hx-swap"="innerHTML">"welcome"</button>
for name in &ALL_STATIC_PAGES[1..] {
|f|{nav_button(f, name)}
}
</section></nav>
<section id="post-form"></section>
<main id="blog">|f|{body(f)}</main>
}
}
fn base(body: impl FnOnce(&mut String), session: Option<&Session>) -> Html<String> { fn base(body: impl FnOnce(&mut String), session: Option<&Session>) -> Html<String> {
let username = session.map(|s| &s.name); let username = session.map(|s| &s.name);
@ -552,14 +717,16 @@ fn base(body: impl FnOnce(&mut String), session: Option<&Session>) -> Html<Strin
<head> <head>
<meta name="charset" content="UTF-8"> <meta name="charset" content="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1"> <meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="description" content="code dependency hell socila media hblang">
<link rel="stylesheet" href="/index.css"> <link rel="stylesheet" href="/index.css">
<title>"depell"</title>
</head> </head>
<body> <body hidden>
<nav> <nav>
<button "hx-push-url"="/" "hx-get"="/index-view" "hx-target"="main" "hx-swap"="innerHTML">"depell"</button> <button "hx-push-url"="/" "hx-get"="/index-view" "hx-target"="main" "hx-swap"="innerHTML">"depell"</button>
<section> <section>
if let Some(username) = username { if let Some(username) = username {
<button "hx-push-url"="/profile" "hx-get"="/profile-view" "hx-target"="main" <button "hx-push-url"={format_args!("/profile/{username}")} "hx-get"="/profile-view" "hx-target"="main"
"hx-swap"="innerHTML">username</button> "hx-swap"="innerHTML">username</button>
|f|{nav_button(f, "feed"); nav_button(f, "post")} |f|{nav_button(f, "feed"); nav_button(f, "post")}
<button "hx-delete"="/login">"logout"</button> <button "hx-delete"="/login">"logout"</button>
@ -682,13 +849,58 @@ mod db {
gen_queries! { gen_queries! {
pub struct Queries { pub struct Queries {
register: "INSERT INTO user (name, password_hash) VALUES(?, ?)", register: "INSERT INTO user (name, password_hash) VALUES(?, ?)",
change_passowrd: "UPDATE user SET password_hash = ? WHERE name = ?",
authenticate: "SELECT name, password_hash FROM user WHERE name = ?", authenticate: "SELECT name, password_hash FROM user WHERE name = ?",
login: "INSERT OR REPLACE INTO session (id, username, expiration) VALUES(?, ?, ?)", login: "INSERT OR REPLACE INTO session (id, username, expiration) VALUES(?, ?, ?)",
logout: "DELETE FROM session WHERE id = ?", logout: "DELETE FROM session WHERE id = ?",
get_session: "SELECT username, expiration FROM session WHERE id = ?", get_session: "SELECT username, expiration FROM session WHERE id = ?",
get_user_posts: "SELECT author, name, timestamp, code FROM post WHERE author = ? get_user_posts: "SELECT author, name, timestamp, code, (
ORDER BY timestamp DESC", WITH RECURSIVE roots(name, author, code) AS (
get_pots_before: "SELECT author, name, timestamp, code FROM post WHERE timestamp < ?", SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM
post JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT (count(*) - 1) FROM roots
) AS imports, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM post
JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT count(*) FROM roots
JOIN run ON roots.name = run.code_name
AND roots.author = run.code_author
) AS runs FROM post as outher WHERE author = ? ORDER BY timestamp DESC",
// TODO: we might want to cache the recursive queries
get_pots_before: "SELECT author, name, timestamp, code, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM
post JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT (count(*) - 1) FROM roots
) AS imports, (
WITH RECURSIVE roots(name, author, code) AS (
SELECT name, author, code FROM post WHERE name = outher.name AND author = outher.author
UNION
SELECT post.name, post.author, post.code FROM post
JOIN import ON post.name = import.from_name
AND post.author = import.from_author
JOIN roots ON import.to_name = roots.name
AND import.to_author = roots.author
) SELECT count(*) FROM roots
JOIN run ON roots.name = run.code_name
AND roots.author = run.code_author
) as runs FROM post AS outher WHERE timestamp < ?",
create_post: "INSERT INTO post (name, author, timestamp, code) VALUES(?, ?, ?, ?)", create_post: "INSERT INTO post (name, author, timestamp, code) VALUES(?, ?, ?, ?)",
fetch_deps: " fetch_deps: "
WITH RECURSIVE roots(name, author, code) AS ( WITH RECURSIVE roots(name, author, code) AS (
@ -703,6 +915,7 @@ mod db {
", ",
create_import: "INSERT INTO import(to_author, to_name, from_author, from_name) create_import: "INSERT INTO import(to_author, to_name, from_author, from_name)
VALUES(?, ?, ?, ?)", VALUES(?, ?, ?, ?)",
creata_run: "INSERT OR IGNORE INTO run(code_name, code_author, runner) VALUES(?, ?, ?)",
} }
} }
@ -729,8 +942,22 @@ mod db {
} }
pub fn init() { pub fn init() {
const SCHEMA_VERSION: usize = 0;
const MIGRATIONS: &[&str] = &[include_str!("migrations/1.sql")];
let db = rusqlite::Connection::open("db.sqlite").unwrap(); let db = rusqlite::Connection::open("db.sqlite").unwrap();
db.execute_batch(include_str!("schema.sql")).unwrap(); db.execute_batch(include_str!("schema.sql")).unwrap();
let schema_version =
db.pragma_query_value(None, "user_version", |v| v.get::<_, usize>(0)).unwrap();
if schema_version != SCHEMA_VERSION {
for &mig in &MIGRATIONS[schema_version..] {
db.execute_batch(mig).expect(mig);
}
db.pragma_update(None, "user_version", SCHEMA_VERSION).unwrap();
}
Queries::new(&db); Queries::new(&db);
} }
} }

View file

@ -1,21 +0,0 @@
<div id="dep-list">
<input placeholder="search impoted deps.." oninput="filterCodeDeps(this, event)">
<section id="deps">
results show here...
</section>
</div>
<div>
<h3>About posting code</h3>
<p>
If you are unfammiliar with <a href="https://git.ablecorp.us/AbleOS/holey-bytes">hblang</a>, refer to the
<strong>hblang/README.md</strong> or
vizit <a href="/profile/mlokis">mlokis'es posts</a>. Preferably don't edit the code here.
</p>
<h3>Extra textarea features</h3>
<ul>
<li>proper tab behaviour</li>
<li>snap to previous tab boundary on "empty" lines</li>
</ul>

View file

@ -0,0 +1,61 @@
# The journey to an optimizing compiler
It's been years that I spent continuously trying to make a compiler for the language of my dreams. The problem was that I wanted something similar to Rust, and if you did not know, `rustc` passed the one million lines of code mark some time ago, so implementing such a language would take me years if not decades. I still tried it anyway.
Besides being extremely ambitious, the problem with my earliest attempts at making a compiler is that literally nobody, not even me, was using the language. Retroactively I am confident that what I implemented was a complex test-case implementation, not a compiler. I often fell into the trap of implementing edge cases instead of an algorithm that would handle not only the very few things the tests do, but also all the other stuff that users of the language would try.
Another part of why I kept failing for all that time is that I did the hardest thing first, without understanding the core concepts involved in translating written language to IR, god forbid assembly. I wasted a lot of time like this, but at least I learned Rust well. At some point I found a job where I started developing a decentralized network, and that fully drew me away from language development.
## Completely new approach
At some point the company I was working for started having financial issues and became unable to pay me. During that period I discovered that my love for networking was largely fueled by the monetary gains associated with it. I burned out and started to look for things to do with my free time.
One could say the timing was perfect, because [`ableos`](https://git.ablecorp.us/AbleOS/ableos) was desperately in need of a sane programming language that compiles to the homemade VM ISA used for all software run on `ableos`, but there was nobody crazy enough to do it. I got terribly nerd-sniped, though I don't regret it. The process of making a language for `ableos` was completely different. Firstly, it needed to be done asap, since the lack of a good language blocked everyone from writing drivers for `ableos`. Secondly, the moment the language was at least a little bit usable, people other than me would start using it. And lastly, the ISA the language compiles to is very simple to emit, understand, and run.
### Urgency is a bliss
I actually managed to make the language somewhat work in one week, mainly because my mindset changed. I no longer spent a lot of time designing syntax for elegance; I designed it so that it is incredibly easy to parse, meaning I could spend minimal effort on the parser and fully focus on the hard problem of translating the AST to instructions. Surprisingly, making everything an expression and not enforcing any arbitrary rules makes the code you can write incredibly flexible, and (most) people love it. One of the decisions I made to save time (or maybe it was an accident) was to not enforce `,` and `;`, meaning you are allowed to write delimiters in lists, but as long as leaving them out does not change the intent of the code, you can leave them out. In practice you almost never need semicolons, unless the next line starts with something sticky like `*x`; in that case you put a semicolon on the previous line to tell the parser where the current expression ends.
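To make the delimiter rule concrete, here is a minimal sketch in Rust (not the actual hblang parser, just the idea) of a list parser that treats `,` and `;` as optional separators which are simply skipped when present:
```rust
// Sketch: separators are allowed between list elements but never required.
#[derive(Debug, PartialEq)]
enum Tok {
    Ident(String),
    Comma,
    Semi,
    Close, // the `)` or `}` that terminates the list
}

fn parse_list(toks: &[Tok]) -> Vec<String> {
    let mut items = Vec::new();
    let mut i = 0;
    while i < toks.len() && toks[i] != Tok::Close {
        // consume one list element
        if let Tok::Ident(name) = &toks[i] {
            items.push(name.clone());
        }
        i += 1;
        // skip any separators that happen to follow, zero or more of them
        while matches!(toks.get(i), Some(Tok::Comma | Tok::Semi)) {
            i += 1;
        }
    }
    items
}

fn main() {
    // `a, b c;` parses the same as `a b c`
    let toks = [
        Tok::Ident("a".into()),
        Tok::Comma,
        Tok::Ident("b".into()),
        Tok::Ident("c".into()),
        Tok::Semi,
        Tok::Close,
    ];
    assert_eq!(parse_list(&toks), ["a", "b", "c"]);
}
```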
### Only the problem I care about
It's good to note that writing a parser is no longer interesting to me. I wrote many parsers before, and writing one no longer feels rewarding, more like a chore. The real problem I was excited about was translating the AST to instructions. I always ended up overcomplicating this step with edge cases for every possible scenario that can happen in code, of which there are infinitely many. So why did I succeed this time? Well, all the friction related to getting something I could execute was so low that I could iterate quickly and realize what I was doing wrong before I burned out. In a week I managed to understand what I had been failing to do for years, partly because of all the previous suffering, but mainly because it was so easy to pivot and try new things. And so I managed to make my first single-pass compiler, and people immediately started using it.
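If you have never written one, here is a toy illustration (again in Rust, not hblang's real codegen) of what "single pass" means: walk the AST once and append instructions as you go, with no later rewriting of what was already emitted:
```rust
// A toy expression language compiled to a tiny stack machine in one traversal.
enum Expr {
    Num(i64),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

#[derive(Debug)]
enum Instr {
    Push(i64),
    Add,
    Mul,
}

fn emit(e: &Expr, out: &mut Vec<Instr>) {
    match e {
        Expr::Num(n) => out.push(Instr::Push(*n)),
        Expr::Add(a, b) => {
            emit(a, out);
            emit(b, out);
            out.push(Instr::Add);
        }
        Expr::Mul(a, b) => {
            emit(a, out);
            emit(b, out);
            out.push(Instr::Mul);
        }
    }
}

fn main() {
    // (1 + 2) * 3  ->  Push(1) Push(2) Add Push(3) Mul
    let e = Expr::Mul(
        Box::new(Expr::Add(Box::new(Expr::Num(1)), Box::new(Expr::Num(2)))),
        Box::new(Expr::Num(3)),
    );
    let mut code = Vec::new();
    emit(&e, &mut code);
    println!("{code:?}");
}
```
The instructions come out in exactly the order the source dictates, which is precisely the property the next sections are about.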
### Don't implement features nobody asked for
Immediately after someone other than me wrote something in `hb`, stuff started breaking. Over the course of a month I kept fixing bugs and adding new features just fine, and more people started to use the language. All was good and well until I looked into the code. It was incredibly cursed, full of tricks to work around the compiler not doing any optimizations. At that moment I realized the whole compiler after the parser needed to be rewritten: I had to implement optimizations, otherwise people won't be able to write readable code that runs fast. All of the features I had added up until then were now technical debt; unless every one of them worked with optimizations, the existing code wouldn't compile. Yes, if a feature exists, be sure as hell it will be used.
It took around 4 months to reimplement everything and make the optimal code look like what you are used to in other languages. I am really thankful for [sea of nodes](https://github.com/SeaOfNodes), and all the amazing work Cliff Click and others do to demystify optimizers. It would have taken much longer for me to figure all the principles out without the exhaustive [tutorial](https://github.com/SeaOfNodes/Simple?tab=readme-ov-file).
## How my understanding of optimizations changed
### Optimizations allow us to scale software
I need to admit, before writing a single-pass compiler and later upgrading it to an optimizing one, I thought optimizations only affect the quality of the final assembly emitted by the compiler. It never occurred to me that what optimizations actually do is reduce the impact of how you decide to write the code. In a single-pass compiler (with zero optimizations), the machine code reflects:
- the order of operations as written in the code
- whether a value was stored in intermediate locations
- the exact structure of the control flow and where each operation is placed
- how many times something is recomputed
- operations that only help convey intent to the reader of the source code
- and more I can't think of...
If you took some code you wrote and then modified it to obfuscate these aspects (relative to the original code), you would be doing a subset of what an optimizing compiler does. Of course, a good compiler tries hard to improve the metrics it is optimizing for (a small before/after sketch follows this list); it would:
- reorder operations to allow the CPU to parallelize them
- remove needless stores, or store values directly to places you can't express in code
- pull operations out of loops and into branches (if it can)
- find all common sub-expressions and compute them only once
- fold constants as much as possible and use obscure tricks to replace slow instructions when any of the operands are constant
- and more...
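As a hand-written illustration (not actual compiler output), this is the kind of rewrite the list above describes, applied to a tiny function:
```rust
// "Before": written for the reader, with redundant work left in.
fn before(xs: &[i64], scale: i64) -> i64 {
    let mut sum = 0;
    for i in 0..xs.len() {
        // `scale * 2 + 1` is loop-invariant and `xs[i]` appears twice
        sum += xs[i] * (scale * 2 + 1) + xs[i];
    }
    sum
}

// "After": the same observable behavior, with the invariant hoisted and the
// common sub-expression computed once; a real backend would go further
// (strength reduction, unrolling, vectorization...).
fn after(xs: &[i64], scale: i64) -> i64 {
    let factor = scale * 2 + 1;
    let mut sum = 0;
    for &x in xs {
        sum += x * factor + x;
    }
    sum
}

fn main() {
    let xs = [1, 2, 3];
    assert_eq!(before(&xs, 3), after(&xs, 3));
}
```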
In the end, compiler optimizations try to reduce the correlation between how the code happens to be written and how well it performs, which is extremely important when you want humans to read the code.
### Optimizing compilers know more than you
Optimizing code is a search problem: an optimizer searches the code for patterns that can be rewritten into something more practical for the computer, while preserving the observable behavior of the program. This means it needs enough context about the code to not make a mistake. In fact, the optimizer has so much context that it is able to determine your code is useless. But wait, didn't you write the code because you needed it to do something? Maybe your intention was to break out of the loop after you are done, but the optimizer looked at the code and said, "great, we are so lucky that this integer is always small enough to miss this check by one, DELETE", and then it goes "jackpot, since this loop is now infinite, we don't need the code after it, DELETE". Notice that the optimizer is eager to delete dead code; it did not ask you "Brah, why did you place all your code after an infinite loop?". This is just an example; there are many more cases where modern optimizers just delete all your code because they proved, without running it, that it does something invalid.
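Here is a minimal hand-written illustration of that silent deletion (a made-up example, not output from any particular optimizer): the compiler can prove the check never fires, so the branch, and with it the only interesting path through the function, quietly disappears, and nobody tells you.
```rust
fn find_first_big(xs: &[u8]) -> Option<usize> {
    for (idx, &x) in xs.iter().enumerate() {
        // x is a u8, so x <= 255 always holds: an optimizer can prove this
        // branch is never taken and delete it, reducing the whole loop to
        // nothing, without ever reporting that the check was pointless.
        if x as u16 > 300 {
            return Some(idx);
        }
    }
    None
}

fn main() {
    assert_eq!(find_first_big(&[1, 2, 3]), None);
}
```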
It's stupid, but it's the world we live in. Optimizers are usually a black box: you import one and feed it the code in a format it understands, it then proceeds to optimize it, and if it finds a glaring bug it won't tell you, god forbid, it will just molest the code in unspecified ways and spit out what's left. Before writing an optimizer, I did not know this can happen, and I did not know this is a problem I pay for with my time, spent figuring out why nothing happens when I run the program.
But wait, it's worse! Since optimizers won't ever share the fact that you are stupid, we end up with other people painstakingly writing complex linters that do a shitty job of detecting the things that matter, and instead whine about style and other bullcrap (and they suck even at that). If the people who write linters and the people who write optimizers swapped roles, I would be ranting about optimizers instead.
And so, this is the area where I want to innovate: let's report the dead code to the frontend, and let the compiler frontend filter out the noise and show the relevant information in the diagnostics. Refuse to compile the program if you `i /= 0`. Refuse to compile if you `arr[arr.len]`. This is the level of stupid the optimizer sees once it normalizes your code, yet it proceeds to protect your feelings. My goal is for hblang to relay this to you as much as possible. If we can query for optimizations, we can query for bugs too.

View file

@ -0,0 +1,8 @@
### About posting code
If you are unfamiliar with [hblang](https://git.ablecorp.us/AbleOS/holey-bytes), refer to the **hblang/README.md** or visit [mlokis's posts](/profile/mlokis). Preferably don't edit the code here.
### Extra textarea features
- proper tab behaviour
- snap to previous tab boundary on "empty" lines

View file

@ -0,0 +1,11 @@
## Welcome to depell
Depell (dependency hell) is a simple "social" media site, except that all you can post is [hblang](https://git.ablecorp.us/AbleOS/holey-bytes) code. Instead of likes you run the program, and instead of mentions you import the program as a dependency. Runs count even when the program is run indirectly.
The backend only serves the code; the frontend compiles and runs it locally. All posts are immutable.
## Security?
All code runs in WASM (inside a holey-bytes VM, until hblang compiles to WASM) and is controlled by JavaScript. WASM
can't do any form of IO without going through JavaScript, so as long as the JS imports don't allow the WASM to execute
arbitrary JS code, WASM can act as a container inside the JS.

View file

@ -1,17 +0,0 @@
<h1>Welcome to depell</h1>
<p>
Depell (dependency hell) is a simple "social" media site best compared to twitter, except that all you can post is
<a href="https://git.ablecorp.us/AbleOS/holey-bytes">hblang</a> code with no comments allowed. Instead of likes you
run the program, and instead of retweets you import the program as dependency. Run counts even when ran indirectly.
</p>
<p>
The backend only serves the code and frontend compiles and runs it locally. All posts are immutable.
</p>
<h2>Security?</h2>
<p>
All code runs in WASM (inside a holey-bytes VM until hblang compiles to wasm) and is controlled by JavaScript. WASM
cant do any form of IO without going trough JavaScript so as long as JS import does not allow wasm to execute
arbitrary JS code, WASM can act as a container inside the JS.
</p>

View file

@ -27,8 +27,16 @@ unsafe extern "C" fn fmt() {
OUTPUT_LEN = MAX_OUTPUT_SIZE - f.0.len(); OUTPUT_LEN = MAX_OUTPUT_SIZE - f.0.len();
} }
#[no_mangle]
unsafe extern "C" fn tok() {
let code = core::slice::from_raw_parts_mut(
core::ptr::addr_of_mut!(OUTPUT).cast(), OUTPUT_LEN);
OUTPUT_LEN = fmt::get_token_kinds(code);
}
#[no_mangle] #[no_mangle]
unsafe extern "C" fn minify() { unsafe extern "C" fn minify() {
let code = core::str::from_raw_parts_mut(core::ptr::addr_of_mut!(OUTPUT).cast(), OUTPUT_LEN); let code = core::str::from_raw_parts_mut(
core::ptr::addr_of_mut!(OUTPUT).cast(), OUTPUT_LEN);
OUTPUT_LEN = fmt::minify(code); OUTPUT_LEN = fmt::minify(code);
} }

View file

@ -4,9 +4,12 @@
use { use {
alloc::{string::String, vec::Vec}, alloc::{string::String, vec::Vec},
core::ffi::CStr,
hblang::{ hblang::{
parser::FileId, backend::hbvm::HbvmBackend,
son::{hbvm::HbvmBackend, Codegen, CodegenCtx}, son::{Codegen, CodegenCtx},
ty::Module,
Ent,
}, },
}; };
@ -60,7 +63,7 @@ unsafe fn compile_and_run(mut fuel: usize) {
let files = { let files = {
let paths = files.iter().map(|f| f.path).collect::<Vec<_>>(); let paths = files.iter().map(|f| f.path).collect::<Vec<_>>();
let mut loader = |path: &str, _: &str, kind| match kind { let mut loader = |path: &str, _: &str, kind| match kind {
hblang::parser::FileKind::Module => Ok(paths.binary_search(&path).unwrap() as FileId), hblang::parser::FileKind::Module => Ok(paths.binary_search(&path).unwrap()),
hblang::parser::FileKind::Embed => Err("embeds are not supported".into()), hblang::parser::FileKind::Embed => Err("embeds are not supported".into()),
}; };
files files
@ -79,7 +82,7 @@ unsafe fn compile_and_run(mut fuel: usize) {
let mut ct = { let mut ct = {
let mut backend = HbvmBackend::default(); let mut backend = HbvmBackend::default();
Codegen::new(&mut backend, &files, &mut ctx).generate(root as FileId); Codegen::new(&mut backend, &files, &mut ctx).generate(Module::new(root));
if !ctx.parser.errors.borrow().is_empty() { if !ctx.parser.errors.borrow().is_empty() {
log::error!("{}", ctx.parser.errors.borrow()); log::error!("{}", ctx.parser.errors.borrow());
@ -97,8 +100,15 @@ unsafe fn compile_and_run(mut fuel: usize) {
break; break;
} }
Ok(hbvm::VmRunOk::Ecall) => { Ok(hbvm::VmRunOk::Ecall) => {
let unknown = ct.vm.read_reg(2).0; let kind = ct.vm.read_reg(2).0;
log::error!("unknown ecall: {unknown}") match kind {
0 => {
let str = ct.vm.read_reg(3).0;
let str = unsafe { CStr::from_ptr(str as _) };
log::error!("{}", str.to_str().unwrap());
}
unknown => log::error!("unknown ecall: {unknown}"),
}
} }
Ok(hbvm::VmRunOk::Timer) => { Ok(hbvm::VmRunOk::Timer) => {
fuel -= 1; fuel -= 1;

View file

@ -12,17 +12,12 @@ name = "fuzz"
path = "src/fuzz_main.rs" path = "src/fuzz_main.rs"
[dependencies] [dependencies]
hashbrown = { version = "0.15.0", default-features = false, features = ["raw-entry", "allocator-api2"] }
hbbytecode = { workspace = true, features = ["disasm"] } hbbytecode = { workspace = true, features = ["disasm"] }
hbvm = { workspace = true, features = ["nightly"] } hbvm = { workspace = true, features = ["nightly", "alloc"] }
hashbrown = { version = "0.15.0", default-features = false, features = ["raw-entry"] }
log = "0.4.22" log = "0.4.22"
[dependencies.regalloc2]
git = "https://github.com/jakubDoka/regalloc2"
branch = "reuse-allocations"
default-features = false
[features] [features]
default = ["std", "regalloc2/trace-log"] default = ["std"]
std = [] std = []
no_log = ["log/max_level_off"] no_log = ["log/max_level_off"]

File diff suppressed because one or more lines are too long

35
lang/build.rs Normal file
View file

@ -0,0 +1,35 @@
use std::{fmt::Write, iter};
fn main() {
const TEST_FILE: &str = "src/testcases.rs";
const INPUT: &str = include_str!("./README.md");
let mut out = String::new();
for (name, code) in block_iter(INPUT) {
let name = name.replace(' ', "_");
_ = writeln!(
out,
"#[test] fn {name}() {{ run_codegen_test(\"{name}\", r##\"{code}\"##) }}"
);
}
std::fs::write(TEST_FILE, out).unwrap();
}
fn block_iter(mut input: &str) -> impl Iterator<Item = (&str, &str)> {
const CASE_PREFIX: &str = "#### ";
const CASE_SUFFIX: &str = "\n```hb";
iter::from_fn(move || loop {
let pos = input.find(CASE_PREFIX)?;
input = unsafe { input.get_unchecked(pos + CASE_PREFIX.len()..) };
let Some((test_name, rest)) = input.split_once(CASE_SUFFIX) else { continue };
if !test_name.chars().all(|c| c.is_alphanumeric() || c == '_') {
continue;
}
input = rest;
let (body, rest) = input.split_once("```").unwrap_or((input, ""));
input = rest;
break Some((test_name, body));
})
}

View file

@ -2,3 +2,4 @@
--fmt-stdout - dont write the formatted file but print it --fmt-stdout - dont write the formatted file but print it
--dump-asm - output assembly instead of raw code, (the assembly is more for debugging the compiler) --dump-asm - output assembly instead of raw code, (the assembly is more for debugging the compiler)
--threads <1...> - number of extra threads compiler can use [default: 0] --threads <1...> - number of extra threads compiler can use [default: 0]
--path-resolver <name> - choose between builtin path resolvers, options are: ableos

View file

@ -1,19 +1,66 @@
use { use {
super::{AssemblySpec, Backend, Nid, Node, Nodes}, super::{AssemblySpec, Backend},
crate::{ crate::{
lexer::TokenKind, lexer::TokenKind,
parser, reg, nodes::{Kind, Nid, Nodes, MEM},
son::{debug_assert_matches, write_reloc, Kind, MEM}, parser,
ty::{self, Loc}, ty::{self, Loc, Module, Offset, Size, Types},
Offset, Reloc, Size, TypedReloc, Types, utils::{EntSlice, EntVec},
}, },
alloc::{boxed::Box, collections::BTreeMap, string::String, vec::Vec}, alloc::{boxed::Box, collections::BTreeMap, string::String, vec::Vec},
core::mem, core::{assert_matches::debug_assert_matches, mem, ops::Range},
hbbytecode::{self as instrs, *}, hbbytecode::{self as instrs, *},
reg::Reg,
}; };
mod my_regalloc; mod regalloc;
mod their_regalloc;
mod reg {
pub const STACK_PTR: Reg = 254;
pub const ZERO: Reg = 0;
pub const RET: Reg = 1;
pub const RET_ADDR: Reg = 31;
pub type Reg = u8;
}
fn write_reloc(doce: &mut [u8], offset: usize, value: i64, size: u16) {
let value = value.to_ne_bytes();
doce[offset..offset + size as usize].copy_from_slice(&value[..size as usize]);
}
#[derive(Clone, Copy)]
struct TypedReloc {
target: ty::Id,
reloc: Reloc,
}
// TODO: make into bit struct (width: u2, sub_offset: u3, offset: u27)
#[derive(Clone, Copy, Debug)]
struct Reloc {
offset: Offset,
sub_offset: u8,
width: u8,
}
impl Reloc {
fn new(offset: usize, sub_offset: u8, width: u8) -> Self {
Self { offset: offset as u32, sub_offset, width }
}
fn apply_jump(mut self, code: &mut [u8], to: u32, from: u32) -> i64 {
self.offset += from;
let offset = to as i64 - self.offset as i64;
self.write_offset(code, offset);
offset
}
fn write_offset(&self, code: &mut [u8], offset: i64) {
let bytes = offset.to_ne_bytes();
let slice = &mut code[self.offset as usize + self.sub_offset as usize..];
slice[..self.width as usize].copy_from_slice(&bytes[..self.width as usize]);
}
}
struct FuncDt { struct FuncDt {
offset: Offset, offset: Offset,
@ -47,11 +94,10 @@ struct Assembler {
#[derive(Default)] #[derive(Default)]
pub struct HbvmBackend { pub struct HbvmBackend {
funcs: Vec<FuncDt>, funcs: EntVec<ty::Func, FuncDt>,
globals: Vec<GlobalDt>, globals: EntVec<ty::Global, GlobalDt>,
asm: Assembler, asm: Assembler,
ralloc: their_regalloc::Regalloc, ralloc: regalloc::Res,
ralloc_my: my_regalloc::Res,
ret_relocs: Vec<Reloc>, ret_relocs: Vec<Reloc>,
relocs: Vec<TypedReloc>, relocs: Vec<TypedReloc>,
@ -98,13 +144,13 @@ impl Backend for HbvmBackend {
debug_assert!(self.asm.funcs.is_empty()); debug_assert!(self.asm.funcs.is_empty());
debug_assert!(self.asm.globals.is_empty()); debug_assert!(self.asm.globals.is_empty());
self.globals.resize_with(types.ins.globals.len(), Default::default); self.globals.shadow(types.ins.globals.len());
self.asm.frontier.push(ty::Kind::Func(from).compress()); self.asm.frontier.push(from.into());
while let Some(itm) = self.asm.frontier.pop() { while let Some(itm) = self.asm.frontier.pop() {
match itm.expand() { match itm.expand() {
ty::Kind::Func(func) => { ty::Kind::Func(func) => {
let fuc = &mut self.funcs[func as usize]; let fuc = &mut self.funcs[func];
debug_assert!(!fuc.code.is_empty()); debug_assert!(!fuc.code.is_empty());
if fuc.offset != u32::MAX { if fuc.offset != u32::MAX {
continue; continue;
@ -114,7 +160,7 @@ impl Backend for HbvmBackend {
self.asm.frontier.extend(fuc.relocs.iter().map(|r| r.target)); self.asm.frontier.extend(fuc.relocs.iter().map(|r| r.target));
} }
ty::Kind::Global(glob) => { ty::Kind::Global(glob) => {
let glb = &mut self.globals[glob as usize]; let glb = &mut self.globals[glob];
if glb.offset != u32::MAX { if glb.offset != u32::MAX {
continue; continue;
} }
@ -128,7 +174,7 @@ impl Backend for HbvmBackend {
let init_len = to.len(); let init_len = to.len();
for &func in &self.asm.funcs { for &func in &self.asm.funcs {
let fuc = &mut self.funcs[func as usize]; let fuc = &mut self.funcs[func];
fuc.offset = to.len() as _; fuc.offset = to.len() as _;
debug_assert!(!fuc.code.is_empty()); debug_assert!(!fuc.code.is_empty());
to.extend(&fuc.code); to.extend(&fuc.code);
@ -137,18 +183,18 @@ impl Backend for HbvmBackend {
let code_length = to.len() - init_len; let code_length = to.len() - init_len;
for global in self.asm.globals.drain(..) { for global in self.asm.globals.drain(..) {
self.globals[global as usize].offset = to.len() as _; self.globals[global].offset = to.len() as _;
to.extend(&types.ins.globals[global as usize].data); to.extend(&types.ins.globals[global].data);
} }
let data_length = to.len() - code_length - init_len; let data_length = to.len() - code_length - init_len;
for func in self.asm.funcs.drain(..) { for func in self.asm.funcs.drain(..) {
let fuc = &self.funcs[func as usize]; let fuc = &self.funcs[func];
for rel in &fuc.relocs { for rel in &fuc.relocs {
let offset = match rel.target.expand() { let offset = match rel.target.expand() {
ty::Kind::Func(fun) => self.funcs[fun as usize].offset, ty::Kind::Func(fun) => self.funcs[fun].offset,
ty::Kind::Global(glo) => self.globals[glo as usize].offset, ty::Kind::Global(glo) => self.globals[glo].offset,
_ => unreachable!(), _ => unreachable!(),
}; };
rel.reloc.apply_jump(to, offset, fuc.offset); rel.reloc.apply_jump(to, offset, fuc.offset);
@ -158,7 +204,7 @@ impl Backend for HbvmBackend {
AssemblySpec { AssemblySpec {
code_length: code_length as _, code_length: code_length as _,
data_length: data_length as _, data_length: data_length as _,
entry: self.funcs[from as usize].offset, entry: self.funcs[from].offset,
} }
} }
@ -167,7 +213,7 @@ impl Backend for HbvmBackend {
mut sluce: &[u8], mut sluce: &[u8],
eca_handler: &mut dyn FnMut(&mut &[u8]), eca_handler: &mut dyn FnMut(&mut &[u8]),
types: &'a Types, types: &'a Types,
files: &'a [parser::Ast], files: &'a EntSlice<Module, parser::Ast>,
output: &mut String, output: &mut String,
) -> Result<(), hbbytecode::DisasmError<'a>> { ) -> Result<(), hbbytecode::DisasmError<'a>> {
use hbbytecode::DisasmItem; use hbbytecode::DisasmItem;
@ -175,11 +221,11 @@ impl Backend for HbvmBackend {
.ins .ins
.funcs .funcs
.iter() .iter()
.zip(&self.funcs) .zip(self.funcs.iter())
.filter(|(_, f)| f.offset != u32::MAX) .filter(|(_, f)| f.offset != u32::MAX)
.map(|(f, fd)| { .map(|(f, fd)| {
let name = if f.file != u32::MAX { let name = if f.file != Module::default() {
let file = &files[f.file as usize]; let file = &files[f.file];
file.ident_str(f.name) file.ident_str(f.name)
} else { } else {
"target_fn" "target_fn"
@ -191,13 +237,13 @@ impl Backend for HbvmBackend {
.ins .ins
.globals .globals
.iter() .iter()
.zip(&self.globals) .zip(self.globals.iter())
.filter(|(_, g)| g.offset != u32::MAX) .filter(|(_, g)| g.offset != u32::MAX)
.map(|(g, gd)| { .map(|(g, gd)| {
let name = if g.file == u32::MAX { let name = if g.file == Module::default() {
core::str::from_utf8(&g.data).unwrap_or("invalid utf-8") core::str::from_utf8(&g.data).unwrap_or("invalid utf-8")
} else { } else {
let file = &files[g.file as usize]; let file = &files[g.file];
file.ident_str(g.name) file.ident_str(g.name)
}; };
(gd.offset, (name, g.data.len() as Size, DisasmItem::Global)) (gd.offset, (name, g.data.len() as Size, DisasmItem::Global))
@ -210,32 +256,43 @@ impl Backend for HbvmBackend {
fn emit_ct_body( fn emit_ct_body(
&mut self, &mut self,
id: ty::Func, id: ty::Func,
nodes: &mut Nodes, nodes: &Nodes,
tys: &Types, tys: &Types,
files: &[parser::Ast], files: &EntSlice<Module, parser::Ast>,
) { ) {
self.emit_body(id, nodes, tys, files); self.emit_body(id, nodes, tys, files);
let fd = &mut self.funcs[id as usize]; let fd = &mut self.funcs[id];
fd.code.truncate(fd.code.len() - instrs::jala(0, 0, 0).0); fd.code.truncate(fd.code.len() - instrs::jala(0, 0, 0).0);
emit(&mut fd.code, instrs::tx()); emit(&mut fd.code, instrs::tx());
} }
fn emit_body(&mut self, id: ty::Func, nodes: &mut Nodes, tys: &Types, files: &[parser::Ast]) { fn emit_body(
let sig = tys.ins.funcs[id as usize].sig.unwrap(); &mut self,
id: ty::Func,
nodes: &Nodes,
tys: &Types,
files: &EntSlice<Module, parser::Ast>,
) {
let sig = tys.ins.funcs[id].sig;
debug_assert!(self.code.is_empty()); debug_assert!(self.code.is_empty());
self.offsets.clear(); self.offsets.clear();
self.offsets.resize(nodes.values.len(), Offset::MAX); self.offsets.resize(nodes.len(), Offset::MAX);
let mut stack_size = 0; let mut stack_size = 0;
'_compute_stack: { '_compute_stack: {
let mems = mem::take(&mut nodes[MEM].outputs); let mems = &nodes[MEM].outputs;
for &stck in mems.iter() { for &stck in mems.iter() {
if !matches!(nodes[stck].kind, Kind::Stck | Kind::Arg) { if !matches!(nodes[stck].kind, Kind::Stck | Kind::Arg) {
debug_assert_matches!( debug_assert_matches!(
nodes[stck].kind, nodes[stck].kind,
Kind::Phi | Kind::Return | Kind::Load | Kind::Call { .. } | Kind::Stre Kind::Phi
| Kind::Return { .. }
| Kind::Load
| Kind::Call { .. }
| Kind::Stre
| Kind::Join
); );
continue; continue;
} }
@ -248,27 +305,23 @@ impl Backend for HbvmBackend {
} }
self.offsets[stck as usize] = stack_size - self.offsets[stck as usize]; self.offsets[stck as usize] = stack_size - self.offsets[stck as usize];
} }
nodes[MEM].outputs = mems;
} }
let (saved, tail) = self.emit_body_code(nodes, sig, tys, files); let (saved, tail) = self.emit_body_code(nodes, sig, tys, files);
//let (saved, tail) = self.emit_body_code_my(nodes, sig, tys, files);
if let Some(last_ret) = self.ret_relocs.last() if let Some(last_ret) = self.ret_relocs.last()
&& last_ret.offset as usize == self.code.len() - 5 && last_ret.offset as usize == self.code.len() - 5
&& self && self
.jump_relocs .jump_relocs
.last() .last()
.map_or(true, |&(r, _)| self.offsets[r as usize] as usize != self.code.len()) .is_none_or(|&(r, _)| self.offsets[r as usize] as usize != self.code.len())
{ {
self.code.truncate(self.code.len() - 5); self.code.truncate(self.code.len() - 5);
self.ret_relocs.pop(); self.ret_relocs.pop();
} }
// FIXME: maybe do this incrementally
for (nd, rel) in self.jump_relocs.drain(..) { for (nd, rel) in self.jump_relocs.drain(..) {
let offset = self.offsets[nd as usize]; let offset = self.offsets[nd as usize];
//debug_assert!(offset < self.code.len() as u32 - 1);
rel.apply_jump(&mut self.code, offset, 0); rel.apply_jump(&mut self.code, offset, 0);
} }
@ -319,11 +372,9 @@ impl Backend for HbvmBackend {
self.emit(instrs::jala(reg::ZERO, reg::RET_ADDR, 0)); self.emit(instrs::jala(reg::ZERO, reg::RET_ADDR, 0));
} }
if self.funcs.get(id as usize).is_none() { self.funcs.shadow(tys.ins.funcs.len());
self.funcs.resize_with(id as usize + 1, Default::default); self.funcs[id].code = mem::take(&mut self.code);
} self.funcs[id].relocs = mem::take(&mut self.relocs);
self.funcs[id as usize].code = mem::take(&mut self.code);
self.funcs[id as usize].relocs = mem::take(&mut self.relocs);
debug_assert_eq!(self.ret_relocs.len(), 0); debug_assert_eq!(self.ret_relocs.len(), 0);
debug_assert_eq!(self.relocs.len(), 0); debug_assert_eq!(self.relocs.len(), 0);
@ -333,27 +384,52 @@ impl Backend for HbvmBackend {
} }
impl Nodes { impl Nodes {
fn cond_op(&self, cnd: Nid) -> CondRet {
let Kind::BinOp { op } = self[cnd].kind else { return None };
if self.is_unlocked(cnd) {
return None;
}
op.cond_op(self[self[cnd].inputs[1]].ty)
}
fn strip_offset(&self, region: Nid) -> (Nid, Offset) {
if matches!(self[region].kind, Kind::BinOp { op: TokenKind::Add | TokenKind::Sub })
&& self.is_locked(region)
&& let Kind::CInt { value } = self[self[region].inputs[2]].kind
{
(self[region].inputs[1], value as _)
} else {
(region, 0)
}
}
fn is_never_used(&self, nid: Nid, tys: &Types) -> bool { fn is_never_used(&self, nid: Nid, tys: &Types) -> bool {
let node = &self[nid]; let node = &self[nid];
match node.kind { match node.kind {
Kind::CInt { .. } => node.outputs.iter().all(|&o| { Kind::CInt { value: 0 } => false,
Kind::CInt { value: 1.. } => node.outputs.iter().all(|&o| {
matches!(self[o].kind, Kind::BinOp { op } matches!(self[o].kind, Kind::BinOp { op }
if op.imm_binop(self[o].ty).is_some() if op.imm_binop(self[o].ty).is_some()
&& self.is_const(self[o].inputs[2]) && self.is_const(self[o].inputs[2])
&& op.cond_op(self[o].ty).is_none()) && op.cond_op(self[o].ty).is_none())
}), }),
Kind::BinOp { op: TokenKind::Mul } if node.ty.is_float() => {
node.outputs.iter().all(|&n| {
self[n].kind == Kind::BinOp { op: TokenKind::Add } && self[n].inputs[1] == nid
})
}
Kind::BinOp { op: TokenKind::Add | TokenKind::Sub } => { Kind::BinOp { op: TokenKind::Add | TokenKind::Sub } => {
self[node.inputs[1]].lock_rc != 0 (self.is_locked(node.inputs[1]) && !self[node.inputs[1]].ty.is_float())
|| (self.is_const(node.inputs[2]) || (self.is_const(node.inputs[2])
&& node.outputs.iter().all(|&n| self[n].uses_direct_offset_of(nid, tys))) && node.outputs.iter().all(|&n| self.uses_direct_offset_of(n, nid, tys)))
} }
Kind::BinOp { op } => { Kind::BinOp { op } => {
op.cond_op(node.ty).is_some() op.cond_op(self[node.inputs[1]].ty).is_some()
&& node.outputs.iter().all(|&n| self[n].kind == Kind::If) && node.outputs.iter().all(|&n| self[n].kind == Kind::If)
} }
Kind::Stck if tys.size_of(node.ty) == 0 => true, Kind::Stck if tys.size_of(node.ty) == 0 => true,
Kind::Stck | Kind::Arg => node.outputs.iter().all(|&n| { Kind::Stck | Kind::Arg => node.outputs.iter().all(|&n| {
self[n].uses_direct_offset_of(nid, tys) self.uses_direct_offset_of(n, nid, tys)
|| (matches!(self[n].kind, Kind::BinOp { op: TokenKind::Add }) || (matches!(self[n].kind, Kind::BinOp { op: TokenKind::Add })
&& self.is_never_used(n, tys)) && self.is_never_used(n, tys))
}), }),
@ -361,22 +437,59 @@ impl Nodes {
_ => false, _ => false,
} }
} }
fn uses_direct_offset_of(&self, user: Nid, target: Nid, tys: &Types) -> bool {
let node = &self[user];
((node.kind == Kind::Stre && node.inputs[2] == target)
|| (node.kind == Kind::Load && node.inputs[1] == target))
&& (node.ty.loc(tys) == Loc::Reg
// this means the struct is actually loaded into a register so no BMC needed
|| (node.kind == Kind::Load
&& !matches!(tys.parama(node.ty).0, Some(PLoc::Ref(..)))
&& node.outputs.iter().all(|&o| matches!(self[o].kind, Kind::Call { .. } | Kind::Return { .. }))))
}
} }
impl Node { impl HbvmBackend {
fn uses_direct_offset_of(&self, nid: Nid, tys: &Types) -> bool { fn extend(
((self.kind == Kind::Stre && self.inputs[2] == nid) &mut self,
|| (self.kind == Kind::Load && self.inputs[1] == nid)) base: ty::Id,
&& self.ty.loc(tys) == Loc::Reg dest: ty::Id,
reg: Reg,
tys: &Types,
files: &EntSlice<Module, parser::Ast>,
) {
if reg == 0 {
return;
}
let (bsize, dsize) = (tys.size_of(base), tys.size_of(dest));
debug_assert!(bsize <= 8, "{}", ty::Display::new(tys, files, base));
debug_assert!(dsize <= 8, "{}", ty::Display::new(tys, files, dest));
if bsize == dsize {
return Default::default();
}
self.emit(match (base.is_signed(), dest.is_signed()) {
(true, true) => {
let op = [instrs::sxt8, instrs::sxt16, instrs::sxt32][bsize.ilog2() as usize];
op(reg, reg)
}
_ => {
let mask = (1u64 << (bsize * 8)) - 1;
instrs::andi(reg, reg, mask)
}
});
} }
} }
type CondRet = Option<(fn(u8, u8, i16) -> EncodedInstr, bool)>;
impl TokenKind { impl TokenKind {
fn cmp_against(self) -> Option<u64> { fn cmp_against(self) -> Option<u64> {
Some(match self { Some(match self {
TokenKind::Le | TokenKind::Gt => 1, Self::Le | Self::Gt => 1,
TokenKind::Ne | TokenKind::Eq => 0, Self::Ne | Self::Eq => 0,
TokenKind::Ge | TokenKind::Lt => (-1i64) as _, Self::Ge | Self::Lt => (-1i64) as _,
_ => return None, _ => return None,
}) })
} }
@ -388,22 +501,21 @@ impl TokenKind {
let size = ty.simple_size().unwrap(); let size = ty.simple_size().unwrap();
let ops = match self { let ops = match self {
TokenKind::Gt => [instrs::fcmpgt32, instrs::fcmpgt64], Self::Gt => [instrs::fcmpgt32, instrs::fcmpgt64],
TokenKind::Lt => [instrs::fcmplt32, instrs::fcmplt64], Self::Lt => [instrs::fcmplt32, instrs::fcmplt64],
_ => return None, _ => return None,
}; };
Some(ops[size.ilog2() as usize - 2]) Some(ops[size.ilog2() as usize - 2])
} }
#[expect(clippy::type_complexity)] fn cond_op(self, ty: ty::Id) -> CondRet {
fn cond_op(self, ty: ty::Id) -> Option<(fn(u8, u8, i16) -> EncodedInstr, bool)> {
if ty.is_float() {
return None;
}
let signed = ty.is_signed(); let signed = ty.is_signed();
Some(( Some((
match self { match self {
Self::Eq => instrs::jne,
Self::Ne => instrs::jeq,
_ if ty.is_float() => return None,
Self::Le if signed => instrs::jgts, Self::Le if signed => instrs::jgts,
Self::Le => instrs::jgtu, Self::Le => instrs::jgtu,
Self::Lt if signed => instrs::jlts, Self::Lt if signed => instrs::jlts,
@ -412,16 +524,14 @@ impl TokenKind {
Self::Ge => instrs::jltu, Self::Ge => instrs::jltu,
Self::Gt if signed => instrs::jgts, Self::Gt if signed => instrs::jgts,
Self::Gt => instrs::jgtu, Self::Gt => instrs::jgtu,
Self::Eq => instrs::jne,
Self::Ne => instrs::jeq,
_ => return None, _ => return None,
}, },
matches!(self, Self::Lt | TokenKind::Gt), matches!(self, Self::Lt | Self::Gt),
)) ))
} }
fn binop(self, ty: ty::Id) -> Option<fn(u8, u8, u8) -> EncodedInstr> { fn binop(self, ty: ty::Id) -> Option<fn(u8, u8, u8) -> EncodedInstr> {
let size = ty.simple_size().unwrap(); let size = ty.simple_size().unwrap_or_else(|| panic!("{:?}", ty.expand()));
if ty.is_integer() || ty == ty::Id::BOOL || ty.is_pointer() { if ty.is_integer() || ty == ty::Id::BOOL || ty.is_pointer() {
macro_rules! div { ($($op:ident),*) => {[$(|a, b, c| $op(a, 0, b, c)),*]}; } macro_rules! div { ($($op:ident),*) => {[$(|a, b, c| $op(a, 0, b, c)),*]}; }
macro_rules! rem { ($($op:ident),*) => {[$(|a, b, c| $op(0, a, b, c)),*]}; } macro_rules! rem { ($($op:ident),*) => {[$(|a, b, c| $op(0, a, b, c)),*]}; }
@ -486,7 +596,7 @@ impl TokenKind {
Self::Band => return Some(andi), Self::Band => return Some(andi),
Self::Bor => return Some(ori), Self::Bor => return Some(ori),
Self::Xor => return Some(xori), Self::Xor => return Some(xori),
Self::Shr if signed => basic_op!(srui8, srui16, srui32, srui64), Self::Shr if signed => basic_op!(srsi8, srsi16, srsi32, srsi64),
Self::Shr => basic_op!(srui8, srui16, srui32, srui64), Self::Shr => basic_op!(srui8, srui16, srui32, srui64),
Self::Shl => basic_op!(slui8, slui16, slui32, slui64), Self::Shl => basic_op!(slui8, slui16, slui32, slui64),
_ => return None, _ => return None,
@ -496,25 +606,84 @@ impl TokenKind {
Some(ops[size.ilog2() as usize]) Some(ops[size.ilog2() as usize])
} }
fn unop(&self, dst: ty::Id, src: ty::Id) -> Option<fn(u8, u8) -> EncodedInstr> { fn unop(&self, dst: ty::Id, src: ty::Id, tys: &Types) -> Option<fn(u8, u8) -> EncodedInstr> {
let src_idx = src.simple_size().unwrap().ilog2() as usize - 2; let src_idx = tys.size_of(src).ilog2() as usize;
Some(match self { Some(match self {
Self::Sub => instrs::neg, Self::Sub => [
|a, b| sub8(a, reg::ZERO, b),
|a, b| sub16(a, reg::ZERO, b),
|a, b| sub32(a, reg::ZERO, b),
|a, b| sub64(a, reg::ZERO, b),
][src_idx],
Self::Not => instrs::not,
Self::Float if dst.is_float() && src.is_integer() => { Self::Float if dst.is_float() && src.is_integer() => {
debug_assert_eq!(dst.simple_size(), src.simple_size()); debug_assert_matches!(
[instrs::itf32, instrs::itf64][src_idx] (dst.simple_size(), src.simple_size()),
(Some(4 | 8), Some(8))
);
[instrs::itf32, instrs::itf64][dst.simple_size().unwrap().ilog2() as usize - 2]
} }
Self::Number if src.is_float() && dst.is_integer() => { Self::Number if src.is_float() && dst.is_integer() => {
[|a, b| instrs::fti32(a, b, 1), |a, b| instrs::fti64(a, b, 1)][src_idx] [|a, b| instrs::fti32(a, b, 1), |a, b| instrs::fti64(a, b, 1)][src_idx - 2]
}
Self::Number if src.is_signed() && (dst.is_integer() || dst.is_pointer()) => {
[instrs::sxt8, instrs::sxt16, instrs::sxt32][src_idx]
}
Self::Number
if (src.is_unsigned() || src == ty::Id::BOOL)
&& (dst.is_integer() || dst.is_pointer()) =>
{
[
|a, b| instrs::andi(a, b, 0xff),
|a, b| instrs::andi(a, b, 0xffff),
|a, b| instrs::andi(a, b, 0xffffffff),
][src_idx]
} }
Self::Float if dst.is_float() && src.is_float() => { Self::Float if dst.is_float() && src.is_float() => {
[instrs::fc32t64, |a, b| instrs::fc64t32(a, b, 1)][src_idx] [instrs::fc32t64, |a, b| instrs::fc64t32(a, b, 1)][src_idx - 2]
} }
_ => return None, _ => return None,
}) })
} }
} }
#[derive(Clone, Copy, Debug)]
enum PLoc {
Reg(Reg, u16),
WideReg(Reg, u16),
Ref(Reg, u32),
}
impl PLoc {
fn reg(self) -> u8 {
match self {
PLoc::Reg(r, _) | PLoc::WideReg(r, _) | PLoc::Ref(r, _) => r,
}
}
}
struct ParamAlloc(Range<Reg>);
impl ParamAlloc {
pub fn next(&mut self, ty: ty::Id, tys: &Types) -> Option<PLoc> {
Some(match tys.size_of(ty) {
0 => return None,
size @ 1..=8 => PLoc::Reg(self.0.next().unwrap(), size as _),
size @ 9..=16 => PLoc::WideReg(self.0.next_chunk::<2>().unwrap()[0], size as _),
size @ 17.. => PLoc::Ref(self.0.next().unwrap(), size),
})
}
}
impl Types {
fn parama(&self, ret: ty::Id) -> (Option<PLoc>, ParamAlloc) {
let mut iter = ParamAlloc(1..12);
let ret = iter.next(ret, self);
iter.0.start += ret.is_none() as u8;
(ret, iter)
}
}
type EncodedInstr = (usize, [u8; instrs::MAX_SIZE]); type EncodedInstr = (usize, [u8; instrs::MAX_SIZE]);
fn emit(out: &mut Vec<u8>, (len, instr): EncodedInstr) { fn emit(out: &mut Vec<u8>, (len, instr): EncodedInstr) {
out.extend_from_slice(&instr[..len]); out.extend_from_slice(&instr[..len]);
@ -528,42 +697,7 @@ fn binary_prelude(to: &mut Vec<u8>) {
#[derive(Default)] #[derive(Default)]
pub struct LoggedMem { pub struct LoggedMem {
pub mem: hbvm::mem::HostMemory, pub mem: hbvm::mem::HostMemory,
op_buf: Vec<hbbytecode::Oper>, logger: hbvm::mem::InstrLogger,
disp_buf: String,
prev_instr: Option<hbbytecode::Instr>,
}
impl LoggedMem {
unsafe fn display_instr<T>(&mut self, instr: hbbytecode::Instr, addr: hbvm::mem::Address) {
let novm: *const hbvm::Vm<Self, 0> = core::ptr::null();
let offset = core::ptr::addr_of!((*novm).memory) as usize;
let regs = unsafe {
&*core::ptr::addr_of!(
(*(((self as *mut _ as *mut u8).sub(offset)) as *const hbvm::Vm<Self, 0>))
.registers
)
};
let mut bytes = core::slice::from_raw_parts(
(addr.get() - 1) as *const u8,
core::mem::size_of::<T>() + 1,
);
use core::fmt::Write;
hbbytecode::parse_args(&mut bytes, instr, &mut self.op_buf).unwrap();
debug_assert!(bytes.is_empty());
self.disp_buf.clear();
write!(self.disp_buf, "{:<10}", format!("{instr:?}")).unwrap();
for (i, op) in self.op_buf.drain(..).enumerate() {
if i != 0 {
write!(self.disp_buf, ", ").unwrap();
}
write!(self.disp_buf, "{op:?}").unwrap();
if let hbbytecode::Oper::R(r) = op {
write!(self.disp_buf, "({})", regs[r as usize].0).unwrap()
}
}
log::trace!("read-typed: {:x}: {}", addr.get(), self.disp_buf);
}
} }
impl hbvm::mem::Memory for LoggedMem { impl hbvm::mem::Memory for LoggedMem {
@ -596,19 +730,13 @@ impl hbvm::mem::Memory for LoggedMem {
} }
unsafe fn prog_read<T: Copy + 'static>(&mut self, addr: hbvm::mem::Address) -> T { unsafe fn prog_read<T: Copy + 'static>(&mut self, addr: hbvm::mem::Address) -> T {
if log::log_enabled!(log::Level::Trace) { self.mem.prog_read(addr)
if core::any::TypeId::of::<u8>() == core::any::TypeId::of::<T>() {
if let Some(instr) = self.prev_instr {
self.display_instr::<()>(instr, addr);
}
self.prev_instr = hbbytecode::Instr::try_from(*(addr.get() as *const u8)).ok();
} else {
let instr = self.prev_instr.take().unwrap();
self.display_instr::<T>(instr, addr);
}
} }
self.mem.prog_read(addr) fn log_instr(&mut self, at: hbvm::mem::Address, regs: &[hbvm::value::Value]) {
log::trace!("read-typed: {:x}: {}", at.get(), unsafe {
self.logger.display_instr(at, regs)
});
} }
} }
@ -715,7 +843,7 @@ pub struct AbleOsExecutableHeader {
#[cfg(test)] #[cfg(test)]
pub fn test_run_vm(out: &[u8], output: &mut String) { pub fn test_run_vm(out: &[u8], output: &mut String) {
use core::fmt::Write; use core::{ffi::CStr, fmt::Write};
let mut stack = [0_u64; 1024 * 20]; let mut stack = [0_u64; 1024 * 20];
@ -732,6 +860,12 @@ pub fn test_run_vm(out: &[u8], output: &mut String) {
match vm.run() { match vm.run() {
Ok(hbvm::VmRunOk::End) => break Ok(()), Ok(hbvm::VmRunOk::End) => break Ok(()),
Ok(hbvm::VmRunOk::Ecall) => match vm.read_reg(2).0 { Ok(hbvm::VmRunOk::Ecall) => match vm.read_reg(2).0 {
37 => writeln!(
output,
"{}",
unsafe { CStr::from_ptr(vm.read_reg(3).0 as _) }.to_str().unwrap()
)
.unwrap(),
1 => writeln!(output, "ev: Ecall").unwrap(), // compatibility with a test 1 => writeln!(output, "ev: Ecall").unwrap(), // compatibility with a test
69 => { 69 => {
let [size, align] = [vm.read_reg(3).0 as usize, vm.read_reg(4).0 as usize]; let [size, align] = [vm.read_reg(3).0 as usize, vm.read_reg(4).0 as usize];

File diff suppressed because it is too large Load diff

View file

@ -1,9 +1,15 @@
use { use {
crate::{ crate::{
lexer::{self, Lexer, TokenKind}, lexer::{self, Lexer, TokenKind},
parser::{self, CommentOr, CtorField, Expr, Poser, Radix, StructField}, parser::{
self, CommentOr, CtorField, EnumField, Expr, FieldList, ListKind, Poser, Radix,
StructField, UnionField,
},
},
core::{
fmt::{self},
mem,
}, },
core::fmt::{self},
}; };
pub fn display_radix(radix: Radix, mut value: u64, buf: &mut [u8; 64]) -> &str { pub fn display_radix(radix: Radix, mut value: u64, buf: &mut [u8; 64]) -> &str {
@ -26,6 +32,71 @@ pub fn display_radix(radix: Radix, mut value: u64, buf: &mut [u8; 64]) -> &str {
unreachable!() unreachable!()
} }
#[repr(u8)]
enum TokenGroup {
Blank,
Comment,
Keyword,
Identifier,
Directive,
Number,
String,
Op,
Assign,
Paren,
Bracket,
Colon,
Comma,
Dot,
Ctor,
}
impl TokenKind {
fn to_higlight_group(self) -> TokenGroup {
use {TokenGroup as TG, TokenKind::*};
match self {
BSlash | Pound | Eof | Ct => TG::Blank,
Comment => TG::Comment,
Directive => TG::Directive,
Colon => TG::Colon,
Semi | Comma => TG::Comma,
Dot => TG::Dot,
Ctor | Arr | Tupl | TArrow | Range => TG::Ctor,
LParen | RParen => TG::Paren,
LBrace | RBrace | LBrack | RBrack => TG::Bracket,
Number | Float => TG::Number,
Under | CtIdent | Ident => TG::Identifier,
Tick | Tilde | Que | Not | Mod | Band | Bor | Xor | Mul | Add | Sub | Div | Shl
| Shr | Or | And | Lt | Gt | Eq | Le | Ge | Ne => TG::Op,
Decl | Assign | BorAss | XorAss | BandAss | AddAss | SubAss | MulAss | DivAss
| ModAss | ShrAss | ShlAss => TG::Assign,
DQuote | Quote => TG::String,
Slf | Defer | Return | If | Else | Loop | Break | Continue | Fn | Idk | Die
| Struct | Packed | True | False | Null | Match | Enum | Union | CtLoop => TG::Keyword,
}
}
}
pub fn get_token_kinds(mut source: &mut [u8]) -> usize {
let len = source.len();
loop {
let src = unsafe { core::str::from_utf8_unchecked(source) };
let mut token = lexer::Lexer::new(src).eat();
match token.kind {
TokenKind::Eof => break,
// the lexer excludes the leading '$'/'@' sigil from CtIdent/Directive tokens, so widen the range to cover it
TokenKind::CtIdent | TokenKind::Directive => token.start -= 1,
_ => {}
}
let start = token.start as usize;
let end = token.end as usize;
source[..start].fill(0);
source[start..end].fill(token.kind.to_higlight_group() as u8);
source = &mut source[end..];
}
len
}
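
A quick usage sketch for the highlighter above (the wrapper function and buffer handling are illustrative, not part of this diff): get_token_kinds rewrites the byte buffer in place, so a caller typically works on a scratch copy of the source.

// Illustrative caller (not from the repository): run the in-place highlighter
// over a scratch copy of the source and keep the original text untouched.
fn highlight_groups(source: &str) -> Vec<u8> {
    let mut buf = source.as_bytes().to_vec();
    // Each tokenized byte is overwritten with its TokenGroup discriminant;
    // the gaps between tokens are filled with 0 (TokenGroup::Blank).
    let len = get_token_kinds(&mut buf);
    debug_assert_eq!(len, source.len());
    buf
}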
pub fn minify(source: &mut str) -> usize { pub fn minify(source: &mut str) -> usize {
fn needs_space(c: u8) -> bool { fn needs_space(c: u8) -> bool {
matches!(c, b'a'..=b'z' | b'A'..=b'Z' | b'0'..=b'9' | 127..) matches!(c, b'a'..=b'z' | b'A'..=b'Z' | b'0'..=b'9' | 127..)
@ -39,7 +110,7 @@ pub fn minify(source: &mut str) -> usize {
let mut token = lexer::Lexer::new(reader).eat(); let mut token = lexer::Lexer::new(reader).eat();
match token.kind { match token.kind {
TokenKind::Eof => break, TokenKind::Eof => break,
TokenKind::CtIdent | TokenKind::Directive => token.start -= 1, TokenKind::CtIdent | TokenKind::CtLoop | TokenKind::Directive => token.start -= 1,
_ => {} _ => {}
} }
@@ -135,24 +206,30 @@ impl<'a> Formatter<'a> {
             return f.write_str(end);
         }

-        writeln!(f)?;
-        self.depth += 1;
+        if !end.is_empty() {
+            writeln!(f)?;
+        }
+        self.depth += !end.is_empty() as usize;
+        let mut already_indented = end.is_empty();
         let res = (|| {
             for (i, stmt) in list.iter().enumerate() {
-                for _ in 0..self.depth {
-                    f.write_str("\t")?;
-                }
+                if !mem::take(&mut already_indented) {
+                    for _ in 0..self.depth {
+                        f.write_str("\t")?;
+                    }
+                }
                 let add_sep = fmt(self, stmt, f)?;
                 if add_sep {
                     f.write_str(sep)?;
                 }
                 if let Some(expr) = list.get(i + 1)
-                    && let Some(rest) = self.source.get(expr.posi() as usize..)
+                    && let Some(prev) = self.source.get(..expr.posi() as usize)
                 {
-                    if insert_needed_semicolon(rest) {
+                    if sep.is_empty() && prev.trim_end().ends_with(';') {
                         f.write_str(";")?;
                     }
-                    if preserve_newlines(&self.source[..expr.posi() as usize]) > 1 {
+                    if count_trailing_newlines(prev) > 1 {
                         f.write_str("\n")?;
                     }
                 }
@@ -162,12 +239,14 @@ impl<'a> Formatter<'a> {
             }
             Ok(())
         })();
-        self.depth -= 1;
-        for _ in 0..self.depth {
-            f.write_str("\t")?;
-        }
-        f.write_str(end)?;
+        self.depth -= !end.is_empty() as usize;
+        if !end.is_empty() {
+            for _ in 0..self.depth {
+                f.write_str("\t")?;
+            }
+            f.write_str(end)?;
+        }
         res
     }
@ -186,6 +265,32 @@ impl<'a> Formatter<'a> {
} }
} }
fn fmt_fields<F: core::fmt::Write, T: Poser + Copy>(
&mut self,
f: &mut F,
keyword: &str,
trailing_comma: bool,
fields: FieldList<T>,
fmt: impl Fn(&mut Self, &T, &mut F) -> Result<(), fmt::Error>,
) -> fmt::Result {
f.write_str(keyword)?;
f.write_str(" {")?;
self.fmt_list_low(f, trailing_comma, "}", ",", fields, |s, field, f| {
match field {
CommentOr::Or(Ok(field)) => fmt(s, field, f)?,
CommentOr::Or(Err(scope)) => {
s.fmt_list(f, true, "", "", scope, Self::fmt)?;
return Ok(false);
}
CommentOr::Comment { literal, .. } => {
f.write_str(literal)?;
f.write_str("\n")?;
}
}
Ok(field.or().is_some())
})
}
pub fn fmt<F: core::fmt::Write>(&mut self, expr: &Expr, f: &mut F) -> fmt::Result { pub fn fmt<F: core::fmt::Write>(&mut self, expr: &Expr, f: &mut F) -> fmt::Result {
macro_rules! impl_parenter { macro_rules! impl_parenter {
($($name:ident => $pat:pat,)*) => { ($($name:ident => $pat:pat,)*) => {
@@ -202,11 +307,13 @@ impl<'a> Formatter<'a> {
         }

         match *expr {
-            Expr::Ct { value, .. } => {
-                f.write_str("$: ")?;
+            Expr::Defer { value, .. } => {
+                f.write_str("defer ")?;
                 self.fmt(value, f)
             }
+            Expr::Slf { .. } => f.write_str("Self"),
             Expr::String { literal, .. } => f.write_str(literal),
+            Expr::Char { literal, .. } => f.write_str(literal),
             Expr::Comment { literal, .. } => f.write_str(literal),
             Expr::Mod { path, .. } => write!(f, "@use(\"{path}\")"),
             Expr::Embed { path, .. } => write!(f, "@embed(\"{path}\")"),
@ -215,6 +322,16 @@ impl<'a> Formatter<'a> {
f.write_str(".")?; f.write_str(".")?;
f.write_str(field) f.write_str(field)
} }
Expr::Range { start, end, .. } => {
if let Some(start) = start {
self.fmt(start, f)?;
}
f.write_str("..")?;
if let Some(end) = end {
self.fmt(end, f)?;
}
Ok(())
}
Expr::Directive { name, args, .. } => { Expr::Directive { name, args, .. } => {
f.write_str("@")?; f.write_str("@")?;
f.write_str(name)?; f.write_str(name)?;
@ -226,25 +343,44 @@ impl<'a> Formatter<'a> {
f.write_str("packed ")?; f.write_str("packed ")?;
} }
write!(f, "struct {{")?; self.fmt_fields(
self.fmt_list_low(f, trailing_comma, "}", ",", fields, |s, field, f| { f,
match field { "struct",
CommentOr::Or(StructField { name, ty, .. }) => { trailing_comma,
fields,
|s, StructField { name, ty, default_value, .. }, f| {
f.write_str(name)?; f.write_str(name)?;
f.write_str(": ")?; f.write_str(": ")?;
s.fmt(ty, f)? s.fmt(ty, f)?;
if let Some(deva) = default_value {
f.write_str(" = ")?;
s.fmt(deva, f)?;
} }
CommentOr::Comment { literal, .. } => { Ok(())
f.write_str(literal)?; },
f.write_str("\n")?; )
}
}
Ok(field.or().is_some())
})
} }
Expr::Union { fields, trailing_comma, .. } => self.fmt_fields(
f,
"union",
trailing_comma,
fields,
|s, UnionField { name, ty, .. }, f| {
f.write_str(name)?;
f.write_str(": ")?;
s.fmt(ty, f)
},
),
Expr::Enum { variants, trailing_comma, .. } => self.fmt_fields(
f,
"enum",
trailing_comma,
variants,
|_, EnumField { name, .. }, f| f.write_str(name),
),
Expr::Ctor { ty, fields, trailing_comma, .. } => { Expr::Ctor { ty, fields, trailing_comma, .. } => {
if let Some(ty) = ty { if let Some(ty) = ty {
self.fmt_paren(ty, f, unary)?; self.fmt_paren(ty, f, postfix)?;
} }
f.write_str(".{")?; f.write_str(".{")?;
self.fmt_list( self.fmt_list(
@ -263,38 +399,43 @@ impl<'a> Formatter<'a> {
}, },
) )
} }
Expr::Tupl { Expr::List {
pos, pos,
kind: term,
ty: Some(&Expr::Slice { pos: spos, size: Some(&Expr::Number { value, .. }), item }), ty: Some(&Expr::Slice { pos: spos, size: Some(&Expr::Number { value, .. }), item }),
fields, fields,
trailing_comma, trailing_comma,
} if value as usize == fields.len() => self.fmt( } if value as usize == fields.len() => self.fmt(
&Expr::Tupl { &Expr::List {
pos, pos,
kind: term,
ty: Some(&Expr::Slice { pos: spos, size: None, item }), ty: Some(&Expr::Slice { pos: spos, size: None, item }),
fields, fields,
trailing_comma, trailing_comma,
}, },
f, f,
), ),
Expr::Tupl { ty, fields, trailing_comma, .. } => { Expr::List { ty, kind: term, fields, trailing_comma, .. } => {
if let Some(ty) = ty { if let Some(ty) = ty {
self.fmt_paren(ty, f, unary)?; self.fmt_paren(ty, f, postfix)?;
} }
f.write_str(".(")?; let (start, end) = match term {
self.fmt_list(f, trailing_comma, ")", ",", fields, Self::fmt) ListKind::Tuple => (".(", ")"),
ListKind::Array => (".[", "]"),
};
f.write_str(start)?;
self.fmt_list(f, trailing_comma, end, ",", fields, Self::fmt)
} }
Expr::Slice { item, size, .. } => { Expr::Slice { item, size, .. } => {
f.write_str("[")?; f.write_str("[")?;
self.fmt(item, f)?;
if let Some(size) = size { if let Some(size) = size {
f.write_str("; ")?;
self.fmt(size, f)?; self.fmt(size, f)?;
} }
f.write_str("]") f.write_str("]")?;
self.fmt_paren(item, f, unary)
} }
Expr::Index { base, index } => { Expr::Index { base, index } => {
self.fmt(base, f)?; self.fmt_paren(base, f, postfix)?;
f.write_str("[")?; f.write_str("[")?;
self.fmt(index, f)?; self.fmt(index, f)?;
f.write_str("]") f.write_str("]")
@@ -316,8 +457,18 @@ impl<'a> Formatter<'a> {
                 }
                 Ok(())
             }
-            Expr::Loop { body, .. } => {
-                f.write_str("loop ")?;
+            Expr::Match { value, branches, .. } => {
+                f.write_str("match ")?;
+                self.fmt(value, f)?;
+                f.write_str(" {")?;
+                self.fmt_list(f, true, "}", ",", branches, |s, br, f| {
+                    s.fmt(&br.pat, f)?;
+                    f.write_str(" => ")?;
+                    s.fmt(&br.body, f)
+                })
+            }
+            Expr::Loop { body, unrolled, .. } => {
+                f.write_str(if unrolled { "$loop " } else { "loop " })?;
                 self.fmt(body, f)
             }
             Expr::Closure { ret, body, args, .. } => {
@@ -407,7 +558,7 @@ impl<'a> Formatter<'a> {
                     prev.rfind(|c: char| c.is_ascii_whitespace()).map_or(prev.len(), |i| i + 1);
                 let exact_bound = lexer::Lexer::new(&prev[estimate_bound..]).last().start;
                 prev = &prev[..exact_bound as usize + estimate_bound];
-                if preserve_newlines(prev) > 0 {
+                if count_trailing_newlines(prev) > 0 {
                     f.write_str("\n")?;
                     for _ in 0..self.depth + 1 {
                         f.write_str("\t")?;
@@ -415,7 +566,9 @@ impl<'a> Formatter<'a> {
                     f.write_str(op.name())?;
                     f.write_str(" ")?;
                 } else {
-                    f.write_str(" ")?;
+                    if op != TokenKind::Colon {
+                        f.write_str(" ")?;
+                    }
                     f.write_str(op.name())?;
                     f.write_str(" ")?;
                 }
@@ -430,15 +583,10 @@ impl<'a> Formatter<'a> {
     }
 }

-pub fn preserve_newlines(source: &str) -> usize {
+pub fn count_trailing_newlines(source: &str) -> usize {
     source[source.trim_end().len()..].bytes().filter(|&c| c == b'\n').count()
 }

-pub fn insert_needed_semicolon(source: &str) -> bool {
-    let kind = lexer::Lexer::new(source).eat().kind;
-    kind.precedence().is_some() || matches!(kind, TokenKind::Ctor | TokenKind::Tupl)
-}
-
 impl core::fmt::Display for parser::Ast {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
         fmt_file(self.exprs(), &self.file, f)
@@ -449,14 +597,14 @@ pub fn fmt_file(exprs: &[Expr], file: &str, f: &mut impl fmt::Write) -> fmt::Res
     for (i, expr) in exprs.iter().enumerate() {
         Formatter::new(file).fmt(expr, f)?;
         if let Some(expr) = exprs.get(i + 1)
-            && let Some(rest) = file.get(expr.pos() as usize..)
+            && let Some(prefix) = file.get(..expr.pos() as usize)
         {
-            if insert_needed_semicolon(rest) {
-                write!(f, ";")?;
+            if prefix.trim_end().ends_with(';') {
+                f.write_str(";")?;
             }
-            if preserve_newlines(&file[..expr.pos() as usize]) > 1 {
-                writeln!(f)?;
+            if count_trailing_newlines(prefix) > 1 {
+                f.write_str("\n")?;
             }
         }
@@ -482,15 +630,7 @@ pub mod test {
         let mut ctx = Ctx::default();
         let ast = parser::Ast::new(ident, minned, &mut ctx, &mut parser::no_loader);

-        //log::error!(
-        //    "{} / {} = {} | {} / {} = {}",
-        //    ast.mem.size(),
-        //    input.len(),
-        //    ast.mem.size() as f32 / input.len() as f32,
-        //    ast.mem.size(),
-        //    ast.file.len(),
-        //    ast.mem.size() as f32 / ast.file.len() as f32
-        //);
+        log::info!("{}", ctx.errors.borrow());

         let mut output = String::new();
         write!(output, "{ast}").unwrap();


@ -1,12 +1,15 @@
use { use {
crate::{ crate::{
parser::{self, Ast, Ctx, FileKind}, backend::hbvm::HbvmBackend,
son::{self, hbvm::HbvmBackend}, parser::{Ast, Ctx, FileKind},
son::{self},
ty, FnvBuildHasher,
}, },
alloc::{string::String, vec::Vec}, alloc::{string::String, vec::Vec},
core::{fmt::Write, num::NonZeroUsize, ops::Deref}, core::{fmt::Write, num::NonZeroUsize, ops::Deref},
hashbrown::hash_map, hashbrown::hash_map,
std::{ std::{
borrow::ToOwned,
collections::VecDeque, collections::VecDeque,
eprintln, eprintln,
ffi::OsStr, ffi::OsStr,
@ -17,6 +20,8 @@ use {
}, },
}; };
type HashMap<K, V> = hashbrown::HashMap<K, V, FnvBuildHasher>;
pub struct Logger; pub struct Logger;
impl log::Log for Logger { impl log::Log for Logger {
@ -33,19 +38,51 @@ impl log::Log for Logger {
fn flush(&self) {} fn flush(&self) {}
} }
pub const ABLEOS_PATH_RESOLVER: PathResolver =
&|mut path: &str, mut from: &str, tmp: &mut PathBuf| {
tmp.clear();
path = match path {
"stn" => {
from = "";
"./sysdata/libraries/stn/src/lib.hb"
}
_ => path,
};
match path.split_once(':') {
Some(("lib", p)) => tmp.extend(["./sysdata/libraries", p, "src/lib.hb"]),
Some(("stn", p)) => {
tmp.extend(["./sysdata/libraries/stn/src", &(p.to_owned() + ".hb")])
}
Some(("sysdata", p)) => tmp.extend(["./sysdata", p]),
None => match Path::new(from).parent() {
Some(parent) => tmp.extend([parent, Path::new(path)]),
None => tmp.push(path),
},
_ => panic!("path: '{path}' is invalid: unexpected ':'"),
};
tmp.canonicalize().map_err(|source| CantLoadFile { path: std::mem::take(tmp), source })
};
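
For comparison, here is a minimal sketch of a user-defined resolver in the same shape. The "vendor:" prefix and directory layout are invented for the example, and it assumes the sketch lives in the same module as the resolvers above (or that PathResolver and CantLoadFile are re-exported, as this diff makes them public).

// Hypothetical resolver: map "vendor:foo/bar.hb" to ./vendor/foo/bar.hb and fall
// back to resolving plain paths relative to the importing file, like default_resolve.
pub const VENDOR_PATH_RESOLVER: PathResolver = &|path: &str, from: &str, tmp: &mut PathBuf| {
    tmp.clear();
    match path.split_once(':') {
        Some(("vendor", p)) => tmp.extend(["./vendor", p]),
        None => match Path::new(from).parent() {
            Some(parent) => tmp.extend([parent, Path::new(path)]),
            None => tmp.push(path),
        },
        _ => panic!("path: '{path}' is invalid: unexpected ':'"),
    };
    tmp.canonicalize().map_err(|source| CantLoadFile { path: std::mem::take(tmp), source })
};

Such a resolver would then be registered alongside the existing entry in the resolvers slice that main.rs builds below, and selected on the command line with --path-resolver vendor.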
#[derive(Default)] #[derive(Default)]
pub struct Options { pub struct Options<'a> {
pub fmt: bool, pub fmt: bool,
pub fmt_stdout: bool, pub fmt_stdout: bool,
pub dump_asm: bool, pub dump_asm: bool,
pub extra_threads: usize, pub extra_threads: usize,
pub resolver: Option<PathResolver<'a>>,
} }
impl Options { impl<'a> Options<'a> {
pub fn from_args(args: &[&str]) -> std::io::Result<Self> { pub fn from_args(
args: &[&str],
out: &mut Vec<u8>,
resolvers: &'a [(&str, PathResolver)],
) -> std::io::Result<Self> {
if args.contains(&"--help") || args.contains(&"-h") { if args.contains(&"--help") || args.contains(&"-h") {
log::error!("Usage: hbc [OPTIONS...] <FILE>"); writeln!(out, "Usage: hbc [OPTIONS...] <FILE>")?;
log::error!(include_str!("../command-help.txt")); writeln!(out, include_str!("../command-help.txt"))?;
return Err(std::io::ErrorKind::Other.into()); return Err(std::io::ErrorKind::Other.into());
} }
@ -58,65 +95,91 @@ impl Options {
.position(|&a| a == "--threads") .position(|&a| a == "--threads")
.map(|i| { .map(|i| {
args[i + 1].parse::<NonZeroUsize>().map_err(|e| { args[i + 1].parse::<NonZeroUsize>().map_err(|e| {
std::io::Error::other(format!("--threads expects non zero integer: {e}")) writeln!(out, "--threads expects non zero integer: {e}")
.err()
.unwrap_or(std::io::ErrorKind::Other.into())
}) })
}) })
.transpose()? .transpose()?
.map_or(1, NonZeroUsize::get) .map_or(1, NonZeroUsize::get)
- 1, - 1,
resolver: args
.iter()
.position(|&a| a == "--path-resolver")
.map(|i| {
resolvers.iter().find(|&&(n, _)| args[i + 1] == n).map(|&(_, r)| r).ok_or_else(
|| {
writeln!(
out,
"--path-resolver can only be one of: {}",
resolvers
.iter()
.map(|&(n, _)| n)
.intersperse(", ")
.collect::<String>()
)
.err()
.unwrap_or(std::io::ErrorKind::Other.into())
},
)
})
.transpose()?,
}) })
} }
} }
pub fn run_compiler(root_file: &str, options: Options, out: &mut Vec<u8>) -> std::io::Result<()> { pub fn run_compiler(
let parsed = parse_from_fs(options.extra_threads, root_file)?; root_file: &str,
options: Options,
out: &mut Vec<u8>,
warnings: &mut String,
) -> std::io::Result<()> {
let parsed = parse_from_fs(
options.extra_threads,
root_file,
options.resolver.unwrap_or(&default_resolve),
)?;
fn format_ast(ast: parser::Ast) -> std::io::Result<()> { if (options.fmt || options.fmt_stdout) && !parsed.errors.is_empty() {
let mut output = String::new(); *out = parsed.errors.into_bytes();
write!(output, "{ast}").unwrap(); return Err(std::io::Error::other("fmt fialed (errors are in out)"));
if ast.file.deref().trim() != output.as_str().trim() {
std::fs::write(&*ast.path, output)?;
}
Ok(())
} }
if options.fmt { if options.fmt {
if !parsed.errors.is_empty() { let mut output = String::new();
*out = parsed.errors.into_bytes(); for ast in parsed.ast {
return Err(std::io::Error::other("parsing fialed")); write!(output, "{ast}").unwrap();
if ast.file.deref().trim() != output.as_str().trim() {
std::fs::write(&*ast.path, &output)?;
} }
output.clear();
for parsed in parsed.ast {
format_ast(parsed)?;
} }
} else if options.fmt_stdout { } else if options.fmt_stdout {
if !parsed.errors.is_empty() { write!(out, "{}", &parsed.ast[0])?;
*out = parsed.errors.into_bytes();
return Err(std::io::Error::other("parsing fialed"));
}
let ast = parsed.ast.into_iter().next().unwrap();
write!(out, "{ast}").unwrap();
} else { } else {
let mut backend = HbvmBackend::default(); let mut backend = HbvmBackend::default();
let mut ctx = crate::son::CodegenCtx::default(); let mut ctx = crate::son::CodegenCtx::default();
*ctx.parser.errors.get_mut() = parsed.errors; *ctx.parser.errors.get_mut() = parsed.errors;
let mut codegen = son::Codegen::new(&mut backend, &parsed.ast, &mut ctx); let mut codegen = son::Codegen::new(&mut backend, &parsed.ast, &mut ctx);
codegen.push_embeds(parsed.embeds); codegen.push_embeds(parsed.embeds);
codegen.generate(0); codegen.generate(ty::Module::MAIN);
*warnings = core::mem::take(&mut *codegen.warnings.borrow_mut());
if !codegen.errors.borrow().is_empty() { if !codegen.errors.borrow().is_empty() {
log::error!("{}", codegen.errors.borrow()); drop(codegen);
return Err(std::io::Error::other("compilation faoled")); *out = ctx.parser.errors.into_inner().into_bytes();
return Err(std::io::Error::other("compilation faoled (errors are in out)"));
} }
codegen.assemble(out); codegen.assemble(out);
if options.dump_asm { if options.dump_asm {
let mut disasm = String::new(); let mut disasm = String::new();
codegen.disasm(&mut disasm, out).map_err(|e| io::Error::other(e.to_string()))?; let err = codegen.disasm(&mut disasm, out).map_err(|e| io::Error::other(e.to_string()));
*out = disasm.into_bytes(); *out = disasm.into_bytes();
err?
} }
} }
@ -214,8 +277,7 @@ pub struct Loaded {
errors: String, errors: String,
} }
pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> { fn default_resolve(path: &str, from: &str, tmp: &mut PathBuf) -> Result<PathBuf, CantLoadFile> {
fn resolve(path: &str, from: &str, tmp: &mut PathBuf) -> Result<PathBuf, CantLoadFile> {
tmp.clear(); tmp.clear();
match Path::new(from).parent() { match Path::new(from).parent() {
Some(parent) => tmp.extend([parent, Path::new(path)]), Some(parent) => tmp.extend([parent, Path::new(path)]),
@ -225,10 +287,14 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
tmp.canonicalize().map_err(|source| CantLoadFile { path: std::mem::take(tmp), source }) tmp.canonicalize().map_err(|source| CantLoadFile { path: std::mem::take(tmp), source })
} }
/// fn(path, from, tmp)
pub type PathResolver<'a> =
&'a (dyn Fn(&str, &str, &mut PathBuf) -> Result<PathBuf, CantLoadFile> + Send + Sync);
#[derive(Debug)] #[derive(Debug)]
struct CantLoadFile { pub struct CantLoadFile {
path: PathBuf, pub path: PathBuf,
source: io::Error, pub source: io::Error,
} }
impl core::fmt::Display for CantLoadFile { impl core::fmt::Display for CantLoadFile {
@ -249,10 +315,15 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
} }
} }
type Task = (u32, PathBuf); pub fn parse_from_fs(
extra_threads: usize,
root: &str,
resolve: PathResolver,
) -> io::Result<Loaded> {
type Task = (usize, PathBuf);
let seen_modules = Mutex::new(crate::HashMap::<PathBuf, u32>::default()); let seen_modules = Mutex::new(HashMap::<PathBuf, usize>::default());
let seen_embeds = Mutex::new(crate::HashMap::<PathBuf, u32>::default()); let seen_embeds = Mutex::new(HashMap::<PathBuf, usize>::default());
let tasks = TaskQueue::<Task>::new(extra_threads + 1); let tasks = TaskQueue::<Task>::new(extra_threads + 1);
let ast = Mutex::new(Vec::<io::Result<Ast>>::new()); let ast = Mutex::new(Vec::<io::Result<Ast>>::new());
let embeds = Mutex::new(Vec::<Vec<u8>>::new()); let embeds = Mutex::new(Vec::<Vec<u8>>::new());
@ -271,7 +342,7 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
} }
hash_map::Entry::Vacant(entry) => { hash_map::Entry::Vacant(entry) => {
physiscal_path = entry.insert_entry(len as _).key().clone(); physiscal_path = entry.insert_entry(len as _).key().clone();
len as u32 len
} }
} }
}; };
@ -296,7 +367,7 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
} }
hash_map::Entry::Vacant(entry) => { hash_map::Entry::Vacant(entry) => {
physiscal_path = entry.insert_entry(len as _).key().clone(); physiscal_path = entry.insert_entry(len as _).key().clone();
len as u32 len
} }
} }
}; };
@ -311,10 +382,10 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
) )
})?; })?;
let mut embeds = embeds.lock().unwrap(); let mut embeds = embeds.lock().unwrap();
if id as usize >= embeds.len() { if id >= embeds.len() {
embeds.resize(id as usize + 1, Default::default()); embeds.resize(id + 1, Default::default());
} }
embeds[id as usize] = content; embeds[id] = content;
Ok(id) Ok(id)
} }
} }
@ -338,9 +409,9 @@ pub fn parse_from_fs(extra_threads: usize, root: &str) -> io::Result<Loaded> {
while let Some(task @ (indx, ..)) = tasks.pop() { while let Some(task @ (indx, ..)) = tasks.pop() {
let res = execute_task(&mut ctx, task, &mut tmp); let res = execute_task(&mut ctx, task, &mut tmp);
let mut ast = ast.lock().unwrap(); let mut ast = ast.lock().unwrap();
let len = ast.len().max(indx as usize + 1); let len = ast.len().max(indx + 1);
ast.resize_with(len, || Err(io::ErrorKind::InvalidData.into())); ast.resize_with(len, || Err(io::ErrorKind::InvalidData.into()));
ast[indx as usize] = res; ast[indx] = res;
} }
ctx.errors.into_inner() ctx.errors.into_inner()
}; };


@@ -1,8 +1,10 @@
 use {
     crate::{
+        backend::hbvm::HbvmBackend,
         lexer::TokenKind,
         parser,
-        son::{hbvm::HbvmBackend, Codegen, CodegenCtx},
+        son::{Codegen, CodegenCtx},
+        ty::Module,
     },
     alloc::string::String,
     core::{fmt::Write, hash::BuildHasher, ops::Range},
@@ -135,6 +137,6 @@ pub fn fuzz(seed_range: Range<u64>) {
         let mut backend = HbvmBackend::default();
         let mut cdg = Codegen::new(&mut backend, core::slice::from_ref(&parsed), &mut ctx);
-        cdg.generate(0);
+        cdg.generate(Module::MAIN);
     }
 }


@@ -32,6 +32,9 @@ macro_rules! gen_token_kind {
         #[keywords] $(
             $keyword:ident = $keyword_lit:literal,
         )*
+        #[const_keywords] $(
+            $const_keyword:ident = $const_keyword_lit:literal,
+        )*
         #[punkt] $(
             $punkt:ident = $punkt_lit:literal,
         )*
@@ -56,6 +59,7 @@ macro_rules! gen_token_kind {
                 match *self {
                     $( Self::$pattern => concat!('<', stringify!($pattern), '>'), )*
                     $( Self::$keyword => stringify!($keyword_lit), )*
+                    $( Self::$const_keyword => concat!('$', $const_keyword_lit), )*
                     $( Self::$punkt => stringify!($punkt_lit), )*
                     $($( Self::$op => $op_lit,
                         $(Self::$assign => concat!($op_lit, "="),)?)*)*
@@ -72,12 +76,23 @@ macro_rules! gen_token_kind {
                 } + 1)
             }

+            #[allow(non_upper_case_globals)]
             fn from_ident(ident: &[u8]) -> Self {
+                $(const $keyword: &[u8] = $keyword_lit.as_bytes();)*
                 match ident {
-                    $($keyword_lit => Self::$keyword,)*
+                    $($keyword => Self::$keyword,)*
                     _ => Self::Ident,
                 }
             }
+
+            #[allow(non_upper_case_globals)]
+            fn from_ct_ident(ident: &[u8]) -> Self {
+                $(const $const_keyword: &[u8] = $const_keyword_lit.as_bytes();)*
+                match ident {
+                    $($const_keyword => Self::$const_keyword,)*
+                    _ => Self::CtIdent,
+                }
+            }
         }
     };
 }
@@ -121,23 +136,11 @@ pub enum TokenKind {
     Ct,

-    Return,
-    If,
-    Else,
-    Loop,
-    Break,
-    Continue,
-    Fn,
-    Struct,
-    Packed,
-    True,
-    False,
-    Null,
-    Idk,
-    Die,
-
     Ctor,
     Tupl,
+    Arr,
+    TArrow,
+    Range,

     Or,
     And,
@@ -147,8 +150,31 @@ pub enum TokenKind {
     BSlash = b'\\',
     RBrack = b']',
     Xor = b'^',
-    Tick = b'`',
     Under = b'_',
+    Tick = b'`',
+
+    Slf,
+    Return,
+    If,
+    Match,
+    Else,
+    Loop,
+    Break,
+    Continue,
+    Fn,
+    Struct,
+    Packed,
+    Enum,
+    Union,
+    True,
+    False,
+    Null,
+    Idk,
+    Die,
+    Defer,
+    CtLoop,

     // Unused = a-z
     LBrace = b'{',
     Bor = b'|',
@ -193,6 +219,10 @@ impl TokenKind {
matches!(self, S::Eq | S::Ne | S::Bor | S::Xor | S::Band | S::Add | S::Mul) matches!(self, S::Eq | S::Ne | S::Bor | S::Xor | S::Band | S::Add | S::Mul)
} }
pub fn is_compatison(self) -> bool {
matches!(self, Self::Lt | Self::Gt | Self::Ge | Self::Le | Self::Ne | Self::Eq)
}
pub fn is_supported_float_op(self) -> bool { pub fn is_supported_float_op(self) -> bool {
matches!( matches!(
self, self,
@ -263,12 +293,11 @@ impl TokenKind {
match self { match self {
Self::Sub if float => (-f64::from_bits(value as _)).to_bits() as _, Self::Sub if float => (-f64::from_bits(value as _)).to_bits() as _,
Self::Sub => value.wrapping_neg(), Self::Sub => value.wrapping_neg(),
Self::Not => (value == 0) as _,
Self::Float if float => value, Self::Float if float => value,
Self::Float => (value as f64).to_bits() as _, Self::Float => (value as f64).to_bits() as _,
Self::Number => { Self::Number if float => f64::from_bits(value as _) as _,
debug_assert!(float); Self::Number => value,
f64::from_bits(value as _).to_bits() as _
}
s => todo!("{s}"), s => todo!("{s}"),
} }
} }
@@ -295,24 +324,34 @@ gen_token_kind! {
     Eof,
     Directive,
     #[keywords]
-    Return = b"return",
-    If = b"if",
-    Else = b"else",
-    Loop = b"loop",
-    Break = b"break",
-    Continue = b"continue",
-    Fn = b"fn",
-    Struct = b"struct",
-    Packed = b"packed",
-    True = b"true",
-    False = b"false",
-    Null = b"null",
-    Idk = b"idk",
-    Die = b"die",
-    Under = b"_",
+    Slf = "Self",
+    Return = "return",
+    If = "if",
+    Match = "match",
+    Else = "else",
+    Loop = "loop",
+    Break = "break",
+    Continue = "continue",
+    Fn = "fn",
+    Struct = "struct",
+    Packed = "packed",
+    Enum = "enum",
+    Union = "union",
+    True = "true",
+    False = "false",
+    Null = "null",
+    Idk = "idk",
+    Die = "die",
+    Defer = "defer",
+    Under = "_",
+    #[const_keywords]
+    CtLoop = "loop",
     #[punkt]
     Ctor = ".{",
     Tupl = ".(",
+    Arr = ".[",
+    TArrow = "=>",
+    Range = "..",
     // #define OP: each `#[prec]` delimeters a level of precedence from lowest to highest
     #[ops]
     #[prec]
@ -391,6 +430,23 @@ impl<'a> Lexer<'a> {
unsafe { core::str::from_utf8_unchecked(&self.source[tok]) } unsafe { core::str::from_utf8_unchecked(&self.source[tok]) }
} }
pub fn taste(&self) -> Token {
Lexer { pos: self.pos, source: self.source }.eat()
}
fn peek_n<const N: usize>(&self) -> Option<&[u8; N]> {
if core::intrinsics::unlikely(self.pos as usize + N > self.source.len()) {
None
} else {
Some(unsafe {
self.source
.get_unchecked(self.pos as usize..self.pos as usize + N)
.first_chunk()
.unwrap_unchecked()
})
}
}
fn peek(&self) -> Option<u8> { fn peek(&self) -> Option<u8> {
if core::intrinsics::unlikely(self.pos >= self.source.len() as u32) { if core::intrinsics::unlikely(self.pos >= self.source.len() as u32) {
None None
@@ -459,7 +515,11 @@ impl<'a> Lexer<'a> {
                     self.advance();
                 }

-                if self.advance_if(b'.') {
+                if self
+                    .peek_n()
+                    .map_or_else(|| self.peek() == Some(b'.'), |&[a, b]| a == b'.' && b != b'.')
+                {
+                    self.pos += 1;
                     while let Some(b'0'..=b'9') = self.peek() {
                         self.advance();
                     }
@@ -511,14 +571,23 @@ impl<'a> Lexer<'a> {
             }
             b'.' if self.advance_if(b'{') => T::Ctor,
             b'.' if self.advance_if(b'(') => T::Tupl,
+            b'.' if self.advance_if(b'[') => T::Arr,
+            b'.' if self.advance_if(b'.') => T::Range,
+            b'=' if self.advance_if(b'>') => T::TArrow,
             b'&' if self.advance_if(b'&') => T::And,
             b'|' if self.advance_if(b'|') => T::Or,
             b'$' if self.advance_if(b':') => T::Ct,
-            b'@' | b'$' => {
+            b'@' => {
                 start += 1;
                 advance_ident(self);
                 identity(c)
             }
+            b'$' => {
+                start += 1;
+                advance_ident(self);
+                let ident = &self.source[start as usize..self.pos as usize];
+                T::from_ct_ident(ident)
+            }
             b'<' | b'>' if self.advance_if(c) => {
                 identity(c - 5 + 128 * self.advance_if(b'=') as u8)
             }
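
A small check of what the new two-byte lookahead changes in number lexing (a sketch; the helper below is not in the repository, only Lexer, eat and the token kinds are): "1.5" still lexes as a float, while "1..5" now ends the number at the first '.' and produces a Range token.

// Illustrative helper: collect token kinds until Eof.
fn kinds(src: &str) -> Vec<TokenKind> {
    let mut lexer = Lexer::new(src);
    core::iter::from_fn(|| Some(lexer.eat()))
        .map(|t| t.kind)
        .take_while(|&k| k != TokenKind::Eof)
        .collect()
}

// Expected behaviour under the change above:
//   kinds("1.5")  -> [Float]
//   kinds("1..5") -> [Number, Range, Number]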

File diff suppressed because it is too large

@@ -1,17 +1,31 @@
 #[cfg(feature = "std")]
-fn main() -> std::io::Result<()> {
+fn main() {
     use std::io::Write;

-    log::set_logger(&hblang::Logger).unwrap();
-    log::set_max_level(log::LevelFilter::Info);
-
-    let args = std::env::args().collect::<Vec<_>>();
-    let args = args.iter().map(String::as_str).collect::<Vec<_>>();
-
-    let opts = hblang::Options::from_args(&args)?;
-    let file = args.iter().filter(|a| !a.starts_with('-')).nth(1).copied().unwrap_or("main.hb");
-
-    let mut out = Vec::new();
-    hblang::run_compiler(file, opts, &mut out)?;
-    std::io::stdout().write_all(&out)
+    fn run(out: &mut Vec<u8>, warnings: &mut String) -> std::io::Result<()> {
+        let args = std::env::args().collect::<Vec<_>>();
+        let args = args.iter().map(String::as_str).collect::<Vec<_>>();
+        let resolvers = &[("ableos", hblang::ABLEOS_PATH_RESOLVER)];
+        let opts = hblang::Options::from_args(&args, out, resolvers)?;
+        let file = args.iter().filter(|a| !a.starts_with('-')).nth(1).copied().unwrap_or("main.hb");
+        hblang::run_compiler(file, opts, out, warnings)
+    }
+
+    log::set_logger(&hblang::fs::Logger).unwrap();
+    log::set_max_level(log::LevelFilter::Error);
+
+    let mut out = Vec::new();
+    let mut warnings = String::new();
+    match run(&mut out, &mut warnings) {
+        Ok(_) => {
+            std::io::stderr().write_all(warnings.as_bytes()).unwrap();
+            std::io::stdout().write_all(&out).unwrap()
+        }
+        Err(_) => {
+            std::io::stderr().write_all(warnings.as_bytes()).unwrap();
+            std::io::stderr().write_all(&out).unwrap();
+            std::process::exit(1);
+        }
+    }
 }

lang/src/nodes.rs (new file, 2253 lines): file diff suppressed because it is too large

@ -2,6 +2,8 @@ use {
crate::{ crate::{
fmt::Formatter, fmt::Formatter,
lexer::{self, Lexer, Token, TokenKind}, lexer::{self, Lexer, Token, TokenKind},
ty::{Global, Module},
utils::Ent as _,
Ident, Ident,
}, },
alloc::{boxed::Box, string::String, vec::Vec}, alloc::{boxed::Box, string::String, vec::Vec},
@ -19,10 +21,9 @@ use {
pub type Pos = u32; pub type Pos = u32;
pub type IdentFlags = u32; pub type IdentFlags = u32;
pub type FileId = u32;
pub type IdentIndex = u16; pub type IdentIndex = u16;
pub type LoaderError = String; pub type LoaderError = String;
pub type Loader<'a> = &'a mut (dyn FnMut(&str, &str, FileKind) -> Result<FileId, LoaderError> + 'a); pub type Loader<'a> = &'a mut (dyn FnMut(&str, &str, FileKind) -> Result<usize, LoaderError> + 'a);
#[derive(PartialEq, Eq, Debug)] #[derive(PartialEq, Eq, Debug)]
pub enum FileKind { pub enum FileKind {
@ -30,7 +31,7 @@ pub enum FileKind {
Embed, Embed,
} }
trait Trans { pub trait Trans {
fn trans(self) -> Self; fn trans(self) -> Self;
} }
@ -63,7 +64,7 @@ pub mod idfl {
} }
} }
pub fn no_loader(_: &str, _: &str, _: FileKind) -> Result<FileId, LoaderError> { pub fn no_loader(_: &str, _: &str, _: FileKind) -> Result<usize, LoaderError> {
Ok(0) Ok(0)
} }
@ -78,6 +79,8 @@ struct ScopeIdent {
ident: Ident, ident: Ident,
declared: bool, declared: bool,
ordered: bool, ordered: bool,
used: bool,
is_ct: bool,
flags: IdentFlags, flags: IdentFlags,
} }
@ -194,8 +197,8 @@ impl<'a, 'b> Parser<'a, 'b> {
fn declare_rec(&mut self, expr: &Expr, top_level: bool) { fn declare_rec(&mut self, expr: &Expr, top_level: bool) {
match *expr { match *expr {
Expr::Ident { pos, id, is_first, .. } => { Expr::Ident { pos, id, is_first, is_ct, .. } => {
self.declare(pos, id, !top_level, is_first || top_level) self.declare(pos, id, !top_level, is_first || top_level, is_ct)
} }
Expr::Ctor { fields, .. } => { Expr::Ctor { fields, .. } => {
for CtorField { value, .. } in fields { for CtorField { value, .. } in fields {
@ -206,7 +209,7 @@ impl<'a, 'b> Parser<'a, 'b> {
} }
} }
fn declare(&mut self, pos: Pos, id: Ident, ordered: bool, valid_order: bool) { fn declare(&mut self, pos: Pos, id: Ident, ordered: bool, valid_order: bool, is_ct: bool) {
if !valid_order { if !valid_order {
self.report( self.report(
pos, pos,
@ -228,7 +231,7 @@ impl<'a, 'b> Parser<'a, 'b> {
); );
return; return;
} }
self.ctx.idents[index].is_ct = is_ct;
self.ctx.idents[index].ordered = ordered; self.ctx.idents[index].ordered = ordered;
} }
@ -247,7 +250,10 @@ impl<'a, 'b> Parser<'a, 'b> {
.enumerate() .enumerate()
.rfind(|(_, elem)| self.lexer.slice(elem.ident.range()) == name) .rfind(|(_, elem)| self.lexer.slice(elem.ident.range()) == name)
{ {
Some((i, elem)) => (i, elem, false), Some((i, elem)) => {
elem.used = true;
(i, elem, false)
}
None => { None => {
let ident = match Ident::new(token.start, name.len() as _) { let ident = match Ident::new(token.start, name.len() as _) {
None => { None => {
@ -260,7 +266,9 @@ impl<'a, 'b> Parser<'a, 'b> {
self.ctx.idents.push(ScopeIdent { self.ctx.idents.push(ScopeIdent {
ident, ident,
declared: false, declared: false,
used: false,
ordered: false, ordered: false,
is_ct: false,
flags: 0, flags: 0,
}); });
(self.ctx.idents.len() - 1, self.ctx.idents.last_mut().unwrap(), true) (self.ctx.idents.len() - 1, self.ctx.idents.last_mut().unwrap(), true)
@ -270,7 +278,7 @@ impl<'a, 'b> Parser<'a, 'b> {
id.flags |= idfl::COMPTIME * is_ct as u32; id.flags |= idfl::COMPTIME * is_ct as u32;
if id.declared && id.ordered && self.ns_bound > i { if id.declared && id.ordered && self.ns_bound > i {
id.flags |= idfl::COMPTIME; id.flags |= idfl::COMPTIME;
self.ctx.captured.push(id.ident); self.ctx.captured.push(CapturedIdent { id: id.ident, is_ct: id.is_ct });
} }
(id.ident, bl) (id.ident, bl)
@ -281,13 +289,27 @@ impl<'a, 'b> Parser<'a, 'b> {
} }
fn unit_expr(&mut self) -> Option<Expr<'a>> { fn unit_expr(&mut self) -> Option<Expr<'a>> {
self.unit_expr_low(true)
}
fn unit_expr_low(&mut self, eat_tail: bool) -> Option<Expr<'a>> {
use {Expr as E, TokenKind as T}; use {Expr as E, TokenKind as T};
if matches!(
self.token.kind,
T::RParen | T::RBrace | T::RBrack | T::Comma | T::Semi | T::Else
) {
self.report(self.token.start, "expected expression")?;
}
let frame = self.ctx.idents.len(); let frame = self.ctx.idents.len();
let token @ Token { start: pos, .. } = self.next(); let token @ Token { start: pos, .. } = self.next();
let prev_boundary = self.ns_bound; let prev_boundary = self.ns_bound;
let prev_captured = self.ctx.captured.len(); let prev_captured = self.ctx.captured.len();
let mut must_trail = false;
let mut expr = match token.kind { let mut expr = match token.kind {
T::Ct => E::Ct { pos, value: self.ptr_expr()? }, T::Defer => E::Defer { pos, value: self.ptr_expr()? },
T::Slf => E::Slf { pos },
T::Directive if self.lexer.slice(token.range()) == "use" => { T::Directive if self.lexer.slice(token.range()) == "use" => {
self.expect_advance(TokenKind::LParen)?; self.expect_advance(TokenKind::LParen)?;
let str = self.expect_advance(TokenKind::DQuote)?; let str = self.expect_advance(TokenKind::DQuote)?;
@ -299,7 +321,7 @@ impl<'a, 'b> Parser<'a, 'b> {
pos, pos,
path, path,
id: match (self.loader)(path, self.path, FileKind::Module) { id: match (self.loader)(path, self.path, FileKind::Module) {
Ok(id) => id, Ok(id) => Module::new(id),
Err(e) => { Err(e) => {
self.report(str.start, format_args!("error loading dependency: {e:#}"))? self.report(str.start, format_args!("error loading dependency: {e:#}"))?
} }
@ -317,7 +339,7 @@ impl<'a, 'b> Parser<'a, 'b> {
pos, pos,
path, path,
id: match (self.loader)(path, self.path, FileKind::Embed) { id: match (self.loader)(path, self.path, FileKind::Embed) {
Ok(id) => id, Ok(id) => Global::new(id),
Err(e) => self.report( Err(e) => self.report(
str.start, str.start,
format_args!("error loading embedded file: {e:#}"), format_args!("error loading embedded file: {e:#}"),
@ -339,6 +361,7 @@ impl<'a, 'b> Parser<'a, 'b> {
T::Idk => E::Idk { pos }, T::Idk => E::Idk { pos },
T::Die => E::Die { pos }, T::Die => E::Die { pos },
T::DQuote => E::String { pos, literal: self.tok_str(token) }, T::DQuote => E::String { pos, literal: self.tok_str(token) },
T::Quote => E::Char { pos, literal: self.tok_str(token) },
T::Packed => { T::Packed => {
self.packed = true; self.packed = true;
let expr = self.unit_expr()?; let expr = self.unit_expr()?;
@ -352,45 +375,62 @@ impl<'a, 'b> Parser<'a, 'b> {
expr expr
} }
T::Struct => E::Struct { T::Struct => E::Struct {
pos,
packed: core::mem::take(&mut self.packed), packed: core::mem::take(&mut self.packed),
fields: { fields: self.collect_fields(&mut must_trail, |s| {
self.ns_bound = self.ctx.idents.len(); if s.lexer.taste().kind != T::Colon {
self.expect_advance(T::LBrace)?; return Some(None);
self.collect_list(T::Comma, T::RBrace, |s| { }
let tok = s.token;
Some(if s.advance_if(T::Comment) {
CommentOr::Comment { literal: s.tok_str(tok), pos: tok.start }
} else {
let name = s.expect_advance(T::Ident)?; let name = s.expect_advance(T::Ident)?;
s.expect_advance(T::Colon)?; s.expect_advance(T::Colon)?;
CommentOr::Or(StructField { let (ty, default_value) = match s.expr()? {
Expr::BinOp { left, op: T::Assign, right, .. } => (*left, Some(*right)),
ty => (ty, None),
};
Some(Some(StructField {
pos: name.start, pos: name.start,
name: s.tok_str(name), name: s.tok_str(name),
ty: s.expr()?, ty,
}) default_value,
}) }))
}) })?,
captured: self.collect_captures(prev_boundary, prev_captured),
trailing_comma: core::mem::take(&mut self.trailing_sep) || must_trail,
}, },
captured: { T::Union => E::Union {
self.ns_bound = prev_boundary; pos,
let captured = &mut self.ctx.captured[prev_captured..]; fields: self.collect_fields(&mut must_trail, |s| {
crate::quad_sort(captured, core::cmp::Ord::cmp); if s.lexer.taste().kind != T::Colon {
let preserved = captured.partition_dedup().0.len(); return Some(None);
self.ctx.captured.truncate(prev_captured + preserved);
self.arena.alloc_slice(&self.ctx.captured[prev_captured..])
},
pos: {
if self.ns_bound == 0 {
// we might save some memory
self.ctx.captured.clear();
} }
pos let name = s.expect_advance(T::Ident)?;
s.expect_advance(T::Colon)?;
Some(Some(UnionField { pos: name.start, name: s.tok_str(name), ty: s.expr()? }))
})?,
captured: self.collect_captures(prev_boundary, prev_captured),
trailing_comma: core::mem::take(&mut self.trailing_sep) || must_trail,
}, },
trailing_comma: core::mem::take(&mut self.trailing_sep), T::Enum => E::Enum {
pos,
variants: self.collect_fields(&mut must_trail, |s| {
if !matches!(s.lexer.taste().kind, T::Comma | T::RBrace) {
return Some(None);
}
let name = s.expect_advance(T::Ident)?;
Some(Some(EnumField { pos: name.start, name: s.tok_str(name) }))
})?,
captured: self.collect_captures(prev_boundary, prev_captured),
trailing_comma: core::mem::take(&mut self.trailing_sep) || must_trail,
}, },
T::Ident | T::CtIdent => { T::Ident | T::CtIdent => {
let (id, is_first) = self.resolve_ident(token); let (id, is_first) = self.resolve_ident(token);
E::Ident { pos, is_ct: token.kind == T::CtIdent, id, is_first } E::Ident {
pos: pos - (token.kind == T::CtIdent) as Pos,
is_ct: token.kind == T::CtIdent,
id,
is_first,
}
} }
T::Under => E::Wildcard { pos }, T::Under => E::Wildcard { pos },
T::If => E::If { T::If => E::If {
@ -399,7 +439,22 @@ impl<'a, 'b> Parser<'a, 'b> {
then: self.ptr_expr()?, then: self.ptr_expr()?,
else_: self.advance_if(T::Else).then(|| self.ptr_expr()).trans()?, else_: self.advance_if(T::Else).then(|| self.ptr_expr()).trans()?,
}, },
T::Loop => E::Loop { pos, body: self.ptr_expr()? }, T::Match => E::Match {
pos,
value: self.ptr_expr()?,
branches: {
self.expect_advance(T::LBrace)?;
self.collect_list(T::Comma, T::RBrace, |s| {
Some(MatchBranch {
pat: s.expr()?,
pos: s.expect_advance(T::TArrow)?.start,
body: s.expr()?,
})
})
},
},
T::Loop => E::Loop { pos, unrolled: false, body: self.ptr_expr()? },
T::CtLoop => E::Loop { pos, unrolled: true, body: self.ptr_expr()? },
T::Break => E::Break { pos }, T::Break => E::Break { pos },
T::Continue => E::Continue { pos }, T::Continue => E::Continue { pos },
T::Return => E::Return { T::Return => E::Return {
@ -418,7 +473,7 @@ impl<'a, 'b> Parser<'a, 'b> {
self.collect_list(T::Comma, T::RParen, |s| { self.collect_list(T::Comma, T::RParen, |s| {
let name = s.advance_ident()?; let name = s.advance_ident()?;
let (id, _) = s.resolve_ident(name); let (id, _) = s.resolve_ident(name);
s.declare(name.start, id, true, true); s.declare(name.start, id, true, true, name.kind == T::CtIdent);
s.expect_advance(T::Colon)?; s.expect_advance(T::Colon)?;
Some(Arg { Some(Arg {
pos: name.start, pos: name.start,
@ -436,23 +491,33 @@ impl<'a, 'b> Parser<'a, 'b> {
body: self.ptr_expr()?, body: self.ptr_expr()?,
}, },
T::Ctor => self.ctor(pos, None), T::Ctor => self.ctor(pos, None),
T::Tupl => self.tupl(pos, None), T::Tupl => self.tupl(pos, None, ListKind::Tuple),
T::Arr => self.tupl(pos, None, ListKind::Array),
T::LBrack => E::Slice { T::LBrack => E::Slice {
item: self.ptr_unit_expr()?, size: {
size: self.advance_if(T::Semi).then(|| self.ptr_expr()).trans()?, if self.advance_if(T::RBrack) {
pos: { None
} else {
let adv = self.ptr_expr()?;
self.expect_advance(T::RBrack)?; self.expect_advance(T::RBrack)?;
pos Some(adv)
}
}, },
item: self.arena.alloc(self.unit_expr_low(false)?),
pos,
}, },
T::Band | T::Mul | T::Xor | T::Sub | T::Que => E::UnOp { T::Band | T::Mul | T::Xor | T::Sub | T::Que | T::Not | T::Dot => E::UnOp {
pos, pos,
op: token.kind, op: token.kind,
val: { val: {
let prev_ident_stack = self.ctx.idents.len();
let expr = self.ptr_unit_expr()?; let expr = self.ptr_unit_expr()?;
if token.kind == T::Band { if token.kind == T::Band {
self.flag_idents(*expr, idfl::REFERENCED); self.flag_idents(*expr, idfl::REFERENCED);
} }
if token.kind == T::Dot {
self.ctx.idents.truncate(prev_ident_stack);
}
expr expr
}, },
}, },
@ -491,9 +556,13 @@ impl<'a, 'b> Parser<'a, 'b> {
tok => self.report(token.start, format_args!("unexpected token: {tok}"))?, tok => self.report(token.start, format_args!("unexpected token: {tok}"))?,
}; };
if eat_tail {
loop { loop {
let token = self.token; let token = self.token;
if matches!(token.kind, T::LParen | T::Ctor | T::Dot | T::Tupl | T::LBrack) { if matches!(
token.kind,
T::LParen | T::Ctor | T::Dot | T::Tupl | T::Arr | T::LBrack | T::Colon
) {
self.next(); self.next();
} }
@ -504,14 +573,56 @@ impl<'a, 'b> Parser<'a, 'b> {
trailing_comma: core::mem::take(&mut self.trailing_sep), trailing_comma: core::mem::take(&mut self.trailing_sep),
}, },
T::Ctor => self.ctor(token.start, Some(expr)), T::Ctor => self.ctor(token.start, Some(expr)),
T::Tupl => self.tupl(token.start, Some(expr)), T::Tupl => self.tupl(token.start, Some(expr), ListKind::Tuple),
T::Arr => self.tupl(token.start, Some(expr), ListKind::Array),
T::LBrack => E::Index { T::LBrack => E::Index {
base: self.arena.alloc(expr), base: self.arena.alloc(expr),
index: { index: self.arena.alloc({
let index = self.expr()?; if self.advance_if(T::Range) {
let pos = self.token.start;
if self.advance_if(T::RBrack) {
Expr::Range { pos, start: None, end: None }
} else {
let res = Expr::Range {
pos,
start: None,
end: Some(self.ptr_expr()?),
};
self.expect_advance(T::RBrack)?; self.expect_advance(T::RBrack)?;
self.arena.alloc(index) res
}
} else {
let start = self.expr()?;
let pos = self.token.start;
if self.advance_if(T::Range) {
let start = self.arena.alloc(start);
if self.advance_if(T::RBrack) {
Expr::Range { pos, start: Some(start), end: None }
} else {
let res = Expr::Range {
pos,
start: Some(start),
end: Some(self.ptr_expr()?),
};
self.expect_advance(T::RBrack)?;
res
}
} else {
self.expect_advance(T::RBrack)?;
start
}
}
}),
}, },
T::Colon => E::BinOp {
left: {
self.declare_rec(&expr, false);
self.arena.alloc(expr)
},
pos,
op: T::Colon,
right: self.ptr_expr()?,
}, },
T::Dot => E::Field { T::Dot => E::Field {
target: self.arena.alloc(expr), target: self.arena.alloc(expr),
@ -524,19 +635,21 @@ impl<'a, 'b> Parser<'a, 'b> {
_ => break, _ => break,
} }
} }
}
if matches!(token.kind, T::Loop | T::LBrace | T::Fn) { if matches!(token.kind, T::Loop | T::LBrace | T::Fn | T::Struct) {
self.pop_scope(frame); self.pop_scope(frame);
} }
Some(expr) Some(expr)
} }
fn tupl(&mut self, pos: Pos, ty: Option<Expr<'a>>) -> Expr<'a> { fn tupl(&mut self, pos: Pos, ty: Option<Expr<'a>>, kind: ListKind) -> Expr<'a> {
Expr::Tupl { Expr::List {
pos, pos,
kind,
ty: ty.map(|ty| self.arena.alloc(ty)), ty: ty.map(|ty| self.arena.alloc(ty)),
fields: self.collect_list(TokenKind::Comma, TokenKind::RParen, Self::expr), fields: self.collect_list(TokenKind::Comma, kind.term(), Self::expr),
trailing_comma: core::mem::take(&mut self.trailing_sep), trailing_comma: core::mem::take(&mut self.trailing_sep),
} }
} }
@ -563,6 +676,49 @@ impl<'a, 'b> Parser<'a, 'b> {
} }
} }
fn collect_fields<T: Copy>(
&mut self,
must_trail: &mut bool,
mut parse_field: impl FnMut(&mut Self) -> Option<Option<T>>,
) -> Option<FieldList<'a, T>> {
use TokenKind as T;
self.ns_bound = self.ctx.idents.len();
self.expect_advance(T::LBrace)?;
Some(self.collect_list(T::Comma, T::RBrace, |s| {
let tok = s.token;
Some(if s.advance_if(T::Comment) {
CommentOr::Comment { literal: s.tok_str(tok), pos: tok.start }
} else if let Some(field) = parse_field(s)? {
CommentOr::Or(Ok(field))
} else {
*must_trail = true;
CommentOr::Or(Err(
s.collect_list_low(T::Semi, T::RBrace, true, |s| s.expr_low(true))
))
})
}))
}
fn collect_captures(
&mut self,
prev_captured: usize,
prev_boundary: usize,
) -> &'a [CapturedIdent] {
self.ns_bound = prev_boundary;
let captured = &mut self.ctx.captured[prev_captured..];
crate::quad_sort(captured, core::cmp::Ord::cmp);
let preserved = captured.partition_dedup().0.len();
self.ctx.captured.truncate(prev_captured + preserved);
let slc = self.arena.alloc_slice(&self.ctx.captured[prev_captured..]);
if self.ns_bound == 0 {
// we might save some memory
self.ctx.captured.clear();
}
slc
}
fn advance_ident(&mut self) -> Option<Token> { fn advance_ident(&mut self) -> Option<Token> {
let next = self.next(); let next = self.next();
if matches!(next.kind, TokenKind::Ident | TokenKind::CtIdent) { if matches!(next.kind, TokenKind::Ident | TokenKind::CtIdent) {
@ -578,6 +734,8 @@ impl<'a, 'b> Parser<'a, 'b> {
if !&self.ctx.idents[i].declared { if !&self.ctx.idents[i].declared {
self.ctx.idents.swap(i, undeclared_count); self.ctx.idents.swap(i, undeclared_count);
undeclared_count += 1; undeclared_count += 1;
} else if !self.ctx.idents[i].used {
self.warn(self.ctx.idents[i].ident.pos(), "unused identifier");
} }
} }
@ -596,11 +754,23 @@ impl<'a, 'b> Parser<'a, 'b> {
&mut self, &mut self,
delim: TokenKind, delim: TokenKind,
end: TokenKind, end: TokenKind,
f: impl FnMut(&mut Self) -> Option<T>,
) -> &'a [T] {
self.collect_list_low(delim, end, false, f)
}
fn collect_list_low<T: Copy>(
&mut self,
delim: TokenKind,
end: TokenKind,
keep_end: bool,
mut f: impl FnMut(&mut Self) -> Option<T>, mut f: impl FnMut(&mut Self) -> Option<T>,
) -> &'a [T] { ) -> &'a [T] {
let mut trailing_sep = false; let mut trailing_sep = false;
let mut view = self.ctx.stack.view(); let mut view = self.ctx.stack.view();
'o: while !self.advance_if(end) { 'o: while (keep_end && self.token.kind != end)
|| (!keep_end && !self.advance_if(end)) && self.token.kind != TokenKind::Eof
{
let val = match f(self) { let val = match f(self) {
Some(val) => val, Some(val) => val,
None => { None => {
@ -657,9 +827,25 @@ impl<'a, 'b> Parser<'a, 'b> {
} }
} }
#[track_caller]
fn warn(&mut self, pos: Pos, msg: impl fmt::Display) {
if log::log_enabled!(log::Level::Error) {
use core::fmt::Write;
writeln!(
self.ctx.warnings.get_mut(),
"(W) {}",
Report::new(self.lexer.source(), self.path, pos, msg)
)
.unwrap();
}
}
#[track_caller] #[track_caller]
fn report(&mut self, pos: Pos, msg: impl fmt::Display) -> Option<!> { fn report(&mut self, pos: Pos, msg: impl fmt::Display) -> Option<!> {
if log::log_enabled!(log::Level::Error) { if log::log_enabled!(log::Level::Error) {
if self.ctx.errors.get_mut().len() > 1024 * 10 {
panic!("{}", self.ctx.errors.get_mut());
}
use core::fmt::Write; use core::fmt::Write;
writeln!( writeln!(
self.ctx.errors.get_mut(), self.ctx.errors.get_mut(),
@ -673,15 +859,19 @@ impl<'a, 'b> Parser<'a, 'b> {
fn flag_idents(&mut self, e: Expr<'a>, flags: IdentFlags) { fn flag_idents(&mut self, e: Expr<'a>, flags: IdentFlags) {
match e { match e {
Expr::Ident { id, .. } => find_ident(&mut self.ctx.idents, id).flags |= flags, Expr::Ident { id, .. } => {
if let Some(f) = find_ident(&mut self.ctx.idents, id) {
f.flags |= flags;
}
}
Expr::Field { target, .. } => self.flag_idents(*target, flags), Expr::Field { target, .. } => self.flag_idents(*target, flags),
_ => {} _ => {}
} }
} }
} }
fn find_ident(idents: &mut [ScopeIdent], id: Ident) -> &mut ScopeIdent { fn find_ident(idents: &mut [ScopeIdent], id: Ident) -> Option<&mut ScopeIdent> {
idents.binary_search_by_key(&id, |si| si.ident).map(|i| &mut idents[i]).unwrap() idents.binary_search_by_key(&id, |si| si.ident).map(|i| &mut idents[i]).ok()
} }
pub fn find_symbol(symbols: &[Symbol], id: Ident) -> &Symbol { pub fn find_symbol(symbols: &[Symbol], id: Ident) -> &Symbol {
@ -755,21 +945,32 @@ pub enum Radix {
Decimal = 10, Decimal = 10,
} }
pub type FieldList<'a, T> = &'a [CommentOr<'a, Result<T, &'a [Expr<'a>]>>];
generate_expr! { generate_expr! {
/// `LIST(start, sep, end, elem) => start { elem sep } [elem] end` /// `LIST(start, sep, end, elem) => start { elem sep } [elem] end`
/// `OP := grep for `#define OP:` /// `OP := grep for `#define OP:`
#[derive(Debug, Clone, Copy, PartialEq, Eq)] #[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Expr<'a> { pub enum Expr<'a> {
/// `'ct' Expr` /// `'defer' Expr`
Ct { Defer {
pos: Pos, pos: Pos,
value: &'a Self, value: &'a Self,
}, },
/// `'Self'`
Slf {
pos: Pos,
},
/// `'"([^"]|\\")"'` /// `'"([^"]|\\")"'`
String { String {
pos: Pos, pos: Pos,
literal: &'a str, literal: &'a str,
}, },
/// `'\'([^']|\\\')\''`
Char {
pos: Pos,
literal: &'a str,
},
/// `'//[^\n]' | '/*' { '([^/*]|*/)*' | Comment } '*/' /// `'//[^\n]' | '/*' { '([^/*]|*/)*' | Comment } '*/'
Comment { Comment {
pos: Pos, pos: Pos,
@ -844,9 +1045,15 @@ generate_expr! {
then: &'a Self, then: &'a Self,
else_: Option<&'a Self>, else_: Option<&'a Self>,
}, },
Match {
pos: Pos,
value: &'a Self,
branches: &'a [MatchBranch<'a>],
},
/// `'loop' Expr` /// `'loop' Expr`
Loop { Loop {
pos: Pos, pos: Pos,
unrolled: bool,
body: &'a Self, body: &'a Self,
}, },
/// `('&' | '*' | '^') Expr` /// `('&' | '*' | '^') Expr`
@ -858,11 +1065,25 @@ generate_expr! {
/// `'struct' LIST('{', ',', '}', Ident ':' Expr)` /// `'struct' LIST('{', ',', '}', Ident ':' Expr)`
Struct { Struct {
pos: Pos, pos: Pos,
fields: &'a [CommentOr<'a, StructField<'a>>], fields: FieldList<'a, StructField<'a>>,
captured: &'a [Ident], captured: &'a [CapturedIdent],
trailing_comma: bool, trailing_comma: bool,
packed: bool, packed: bool,
}, },
/// `'union' LIST('{', ',', '}', Ident ':' Expr)`
Union {
pos: Pos,
fields: FieldList<'a, UnionField<'a>>,
captured: &'a [CapturedIdent],
trailing_comma: bool,
},
/// `'enum' LIST('{', ',', '}', Ident)`
Enum {
pos: Pos,
variants: FieldList<'a, EnumField<'a>>,
captured: &'a [CapturedIdent],
trailing_comma: bool,
},
/// `[Expr] LIST('.{', ',', '}', Ident [':' Expr])` /// `[Expr] LIST('.{', ',', '}', Ident [':' Expr])`
Ctor { Ctor {
pos: Pos, pos: Pos,
@ -871,8 +1092,9 @@ generate_expr! {
trailing_comma: bool, trailing_comma: bool,
}, },
/// `[Expr] LIST('.(', ',', ')', Ident [':' Expr])` /// `[Expr] LIST('.(', ',', ')', Ident [':' Expr])`
Tupl { List {
pos: Pos, pos: Pos,
kind: ListKind,
ty: Option<&'a Self>, ty: Option<&'a Self>,
fields: &'a [Self], fields: &'a [Self],
trailing_comma: bool, trailing_comma: bool,
@ -888,6 +1110,12 @@ generate_expr! {
base: &'a Self, base: &'a Self,
index: &'a Self, index: &'a Self,
}, },
/// `[ Expr ] .. [ Expr ]`
Range {
pos: u32,
start: Option<&'a Self>,
end: Option<&'a Self>,
},
/// `Expr '.' Ident` /// `Expr '.' Ident`
Field { Field {
target: &'a Self, target: &'a Self,
@ -921,22 +1149,44 @@ generate_expr! {
/// `'@use' '(' String ')'` /// `'@use' '(' String ')'`
Mod { Mod {
pos: Pos, pos: Pos,
id: FileId, id: Module,
path: &'a str, path: &'a str,
}, },
/// `'@use' '(' String ')'` /// `'@use' '(' String ')'`
Embed { Embed {
pos: Pos, pos: Pos,
id: FileId, id: Global,
path: &'a str, path: &'a str,
}, },
} }
} }
#[derive(Clone, Copy, PartialEq, Eq, Debug, PartialOrd, Ord)]
pub struct CapturedIdent {
pub id: Ident,
pub is_ct: bool,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ListKind {
Tuple,
Array,
}
impl ListKind {
fn term(self) -> TokenKind {
match self {
ListKind::Tuple => TokenKind::RParen,
ListKind::Array => TokenKind::RBrack,
}
}
}
impl Expr<'_> { impl Expr<'_> {
pub fn declares(&self, iden: Result<Ident, &str>, source: &str) -> Option<Ident> { pub fn declares(&self, iden: DeclId, source: &str) -> Option<Ident> {
match *self { match *self {
Self::Ident { id, .. } if iden == Ok(id) || iden == Err(&source[id.range()]) => { Self::Ident { id, .. }
if iden == DeclId::Ident(id) || iden == DeclId::Name(&source[id.range()]) =>
{
Some(id) Some(id)
} }
Self::Ctor { fields, .. } => fields.iter().find_map(|f| f.value.declares(iden, source)), Self::Ctor { fields, .. } => fields.iter().find_map(|f| f.value.declares(iden, source)),
@ -952,14 +1202,14 @@ impl Expr<'_> {
} }
} }
pub fn find_pattern_path<T, F: FnOnce(&Expr) -> T>( pub fn find_pattern_path<T, F: FnOnce(&Expr, bool) -> T>(
&self, &self,
ident: Ident, ident: Ident,
target: &Expr, target: &Expr,
mut with_final: F, mut with_final: F,
) -> Result<T, F> { ) -> Result<T, F> {
match *self { match *self {
Self::Ident { id, .. } if id == ident => Ok(with_final(target)), Self::Ident { id, is_ct, .. } if id == ident => Ok(with_final(target, is_ct)),
Self::Ctor { fields, .. } => { Self::Ctor { fields, .. } => {
for &CtorField { name, value, pos } in fields { for &CtorField { name, value, pos } in fields {
match value.find_pattern_path( match value.find_pattern_path(
@ -978,11 +1228,50 @@ impl Expr<'_> {
} }
} }
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct MatchBranch<'a> {
pub pat: Expr<'a>,
pub pos: Pos,
pub body: Expr<'a>,
}
impl Poser for MatchBranch<'_> {
fn posi(&self) -> Pos {
self.pat.pos()
}
}
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct EnumField<'a> {
pub pos: Pos,
pub name: &'a str,
}
impl Poser for EnumField<'_> {
fn posi(&self) -> Pos {
self.pos
}
}
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct UnionField<'a> {
pub pos: Pos,
pub name: &'a str,
pub ty: Expr<'a>,
}
impl Poser for UnionField<'_> {
fn posi(&self) -> Pos {
self.pos
}
}
#[derive(Clone, Copy, PartialEq, Eq, Debug)] #[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct StructField<'a> { pub struct StructField<'a> {
pub pos: Pos, pub pos: Pos,
pub name: &'a str, pub name: &'a str,
pub ty: Expr<'a>, pub ty: Expr<'a>,
pub default_value: Option<Expr<'a>>,
} }
impl Poser for StructField<'_> { impl Poser for StructField<'_> {
@ -1008,6 +1297,18 @@ pub trait Poser {
fn posi(&self) -> Pos; fn posi(&self) -> Pos;
} }
impl<O: Poser, E: Poser> Poser for Result<O, E> {
fn posi(&self) -> Pos {
self.as_ref().map_or_else(Poser::posi, Poser::posi)
}
}
impl<T: Poser> Poser for &[T] {
fn posi(&self) -> Pos {
self[0].posi()
}
}
impl Poser for Pos { impl Poser for Pos {
fn posi(&self) -> Pos { fn posi(&self) -> Pos {
*self *self
@ -1035,9 +1336,9 @@ pub enum CommentOr<'a, T> {
Comment { literal: &'a str, pos: Pos }, Comment { literal: &'a str, pos: Pos },
} }
impl<T: Copy> CommentOr<'_, T> { impl<T> CommentOr<'_, T> {
pub fn or(&self) -> Option<T> { pub fn or(&self) -> Option<&T> {
match *self { match self {
CommentOr::Or(v) => Some(v), CommentOr::Or(v) => Some(v),
CommentOr::Comment { .. } => None, CommentOr::Comment { .. } => None,
} }
@ -1064,10 +1365,11 @@ impl core::fmt::Display for Display<'_> {
#[derive(Default)] #[derive(Default)]
pub struct Ctx { pub struct Ctx {
pub errors: RefCell<String>, pub errors: RefCell<String>,
pub warnings: RefCell<String>,
symbols: Vec<Symbol>, symbols: Vec<Symbol>,
stack: StackAlloc, stack: StackAlloc,
idents: Vec<ScopeIdent>, idents: Vec<ScopeIdent>,
captured: Vec<Ident>, captured: Vec<CapturedIdent>,
} }
impl Ctx { impl Ctx {
@ -1206,17 +1508,43 @@ impl Ast {
unsafe { self.0.as_ref() } unsafe { self.0.as_ref() }
} }
pub fn find_decl(&self, id: Result<Ident, &str>) -> Option<(&Expr, Ident)> { pub fn find_decl(&self, id: DeclId) -> Option<(&Expr, Ident)> {
self.exprs().iter().find_map(|expr| match expr { find_decl(self.exprs(), &self.file, id)
Expr::BinOp { left, op: TokenKind::Decl, .. } => { }
left.declares(id, &self.file).map(|id| (expr, id))
pub fn ident_str(&self, ident: Ident) -> &str {
&self.file[ident.range()]
}
}
pub fn find_decl<'a>(
exprs: &'a [Expr<'a>],
file: &str,
id: DeclId,
) -> Option<(&'a Expr<'a>, Ident)> {
exprs.iter().find_map(|expr| match expr {
Expr::BinOp { left, op: TokenKind::Decl | TokenKind::Colon, .. } => {
left.declares(id, file).map(|id| (expr, id))
} }
_ => None, _ => None,
}) })
} }
pub fn ident_str(&self, ident: Ident) -> &str { #[derive(PartialEq, Eq, Clone, Copy)]
&self.file[ident.range()] pub enum DeclId<'a> {
Ident(Ident),
Name(&'a str),
}
impl From<Ident> for DeclId<'_> {
fn from(value: Ident) -> Self {
Self::Ident(value)
}
}
impl<'a> From<&'a str> for DeclId<'a> {
fn from(value: &'a str) -> Self {
Self::Name(value)
} }
} }
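A hedged usage sketch (not part of the diff) of the new DeclId indirection: thanks to the From impls above, Ast::find_decl accepts either an interned Ident or a raw source name. The bindings `ast: Ast` and `ident: Ident` are assumptions, not from the diff.

    // Hypothetical call sites; `ast` and `ident` are assumed to exist.
    let by_name = ast.find_decl(DeclId::Name("main"));
    let by_ident = ast.find_decl(ident.into()); // via From<Ident> for DeclId
    let also_by_name = ast.find_decl("main".into()); // via From<&str> for DeclId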
@ -1226,7 +1554,7 @@ impl Default for Ast {
} }
} }
#[derive(Clone, Copy)] #[derive(Clone, Copy, PartialEq, Eq, Hash)]
#[repr(packed)] #[repr(packed)]
pub struct ExprRef(NonNull<Expr<'static>>); pub struct ExprRef(NonNull<Expr<'static>>);
@ -1492,7 +1820,7 @@ impl ArenaChunk {
fn contains(&self, arg: *mut u8) -> bool { fn contains(&self, arg: *mut u8) -> bool {
(self.base <= arg && unsafe { self.base.add(self.size) } > arg) (self.base <= arg && unsafe { self.base.add(self.size) } > arg)
|| self.next().map_or(false, |s| s.contains(arg)) || self.next().is_some_and(|s| s.contains(arg))
} }
pub fn size(&self) -> usize { pub fn size(&self) -> usize {

File diff suppressed because it is too large.

@ -1,903 +0,0 @@
use {
super::{HbvmBackend, Nid, Nodes},
crate::{
lexer::TokenKind,
parser,
reg::{self, Reg},
son::{debug_assert_matches, Kind, ARG_START, MEM, VOID},
ty::{self, Arg, Loc},
utils::{BitSet, Vc},
Offset, PLoc, Reloc, Sig, TypedReloc, Types,
},
alloc::{borrow::ToOwned, vec::Vec},
core::{mem, ops::Range},
hbbytecode::{self as instrs},
};
impl HbvmBackend {
pub fn emit_body_code_my(
&mut self,
nodes: &mut Nodes,
sig: Sig,
tys: &Types,
files: &[parser::Ast],
) -> (usize, bool) {
let mut fuc = Function::new(nodes, tys, sig);
log::info!("{fuc:?}");
let mut res = mem::take(&mut self.ralloc_my);
Env::new(&fuc, &fuc.func, &mut res).run();
'_open_function: {
self.emit(instrs::addi64(reg::STACK_PTR, reg::STACK_PTR, 0));
self.emit(instrs::st(reg::RET_ADDR + fuc.tail as u8, reg::STACK_PTR, 0, 0));
}
let reg_offset = if fuc.tail { reg::RET + 12 } else { reg::RET_ADDR + 1 };
res.node_to_reg.iter_mut().filter(|r| **r != 0).for_each(|r| {
*r += reg_offset - 1;
if fuc.tail && *r >= reg::RET_ADDR {
*r += 1;
}
});
let atr = |allc: Nid| res.node_to_reg[allc as usize];
//for (id, node) in fuc.nodes.iter() {
// if node.kind == Kind::Phi {
// debug_assert_eq!(atr(node.inputs[1]), atr(node.inputs[2]));
// debug_assert_eq!(atr(id), atr(node.inputs[2]));
// }
//}
let (retl, mut parama) = tys.parama(sig.ret);
let mut typs = sig.args.args();
let mut args = fuc.nodes[VOID].outputs[ARG_START..].iter();
while let Some(aty) = typs.next(tys) {
let Arg::Value(ty) = aty else { continue };
let Some(loc) = parama.next(ty, tys) else { continue };
let &arg = args.next().unwrap();
let (rg, size) = match loc {
PLoc::WideReg(rg, size) => (rg, size),
PLoc::Reg(rg, size) if ty.loc(tys) == Loc::Stack => (rg, size),
PLoc::Reg(r, ..) | PLoc::Ref(r, ..) => {
self.emit(instrs::cp(atr(arg), r));
continue;
}
};
self.emit(instrs::st(rg, reg::STACK_PTR, self.offsets[arg as usize] as _, size));
if fuc.nodes[arg].lock_rc == 0 {
self.emit(instrs::addi64(rg, reg::STACK_PTR, self.offsets[arg as usize] as _));
}
self.emit(instrs::cp(atr(arg), rg));
}
for (i, block) in fuc.func.blocks.iter().enumerate() {
self.offsets[block.entry as usize] = self.code.len() as _;
for &nid in &fuc.func.instrs[block.range.clone()] {
if nid == VOID {
continue;
}
let node = &fuc.nodes[nid];
let extend = |base: ty::Id, dest: ty::Id, from: Nid, to: Nid| {
let (bsize, dsize) = (tys.size_of(base), tys.size_of(dest));
debug_assert!(bsize <= 8, "{}", ty::Display::new(tys, files, base));
debug_assert!(dsize <= 8, "{}", ty::Display::new(tys, files, dest));
if bsize == dsize {
return Default::default();
}
match (base.is_signed(), dest.is_signed()) {
(true, true) => {
let op = [instrs::sxt8, instrs::sxt16, instrs::sxt32]
[bsize.ilog2() as usize];
op(atr(to), atr(from))
}
_ => {
let mask = (1u64 << (bsize * 8)) - 1;
instrs::andi(atr(to), atr(from), mask)
}
}
};
match node.kind {
Kind::If => {
let &[_, cnd] = node.inputs.as_slice() else { unreachable!() };
if let Kind::BinOp { op } = fuc.nodes[cnd].kind
&& let Some((op, swapped)) =
op.cond_op(fuc.nodes[fuc.nodes[cnd].inputs[1]].ty)
{
let &[_, lhs, rhs] = fuc.nodes[cnd].inputs.as_slice() else {
unreachable!()
};
self.emit(extend(fuc.nodes[lhs].ty, fuc.nodes[lhs].ty.extend(), 0, 0));
self.emit(extend(fuc.nodes[rhs].ty, fuc.nodes[rhs].ty.extend(), 1, 1));
let rel = Reloc::new(self.code.len(), 3, 2);
self.jump_relocs.push((node.outputs[!swapped as usize], rel));
self.emit(op(atr(lhs), atr(rhs), 0));
} else {
self.emit(extend(fuc.nodes[cnd].ty, fuc.nodes[cnd].ty.extend(), 0, 0));
let rel = Reloc::new(self.code.len(), 3, 2);
self.jump_relocs.push((node.outputs[0], rel));
self.emit(instrs::jne(atr(cnd), reg::ZERO, 0));
}
}
Kind::Loop | Kind::Region => {
if (mem::replace(&mut fuc.backrefs[nid as usize], u16::MAX) != u16::MAX)
^ (node.kind == Kind::Loop)
{
let index = (node.kind == Kind::Loop) as usize + 1;
for &out in node.outputs.iter() {
if fuc.nodes[out].is_data_phi()
&& atr(out) != atr(fuc.nodes[out].inputs[index])
{
self.emit(instrs::cp(
atr(out),
atr(fuc.nodes[out].inputs[index]),
));
}
}
let rel = Reloc::new(self.code.len(), 1, 4);
self.jump_relocs.push((nid, rel));
self.emit(instrs::jmp(0));
} else {
let index = (node.kind != Kind::Loop) as usize + 1;
for &out in node.outputs.iter() {
if fuc.nodes[out].is_data_phi()
&& atr(out) != atr(fuc.nodes[out].inputs[index])
{
self.emit(instrs::cp(
atr(out),
atr(fuc.nodes[out].inputs[index]),
));
}
}
}
}
Kind::Return => {
let &[_, mut ret, ..] = node.inputs.as_slice() else { unreachable!() };
match retl {
None => {}
Some(PLoc::Reg(r, _)) if sig.ret.loc(tys) == Loc::Reg => {
self.emit(instrs::cp(r, atr(ret)));
}
Some(PLoc::Reg(r, size)) | Some(PLoc::WideReg(r, size)) => {
ret = match fuc.nodes[ret].kind {
Kind::Load { .. } => fuc.nodes[ret].inputs[1],
_ => ret,
};
self.emit(instrs::ld(r, atr(ret), 0, size))
}
Some(PLoc::Ref(_, size)) => {
ret = match fuc.nodes[ret].kind {
Kind::Load { .. } => fuc.nodes[ret].inputs[1],
_ => ret,
};
let [src, dst] = [atr(ret), atr(MEM)];
if let Ok(size) = u16::try_from(size) {
self.emit(instrs::bmc(src, dst, size));
} else {
for _ in 0..size / u16::MAX as u32 {
self.emit(instrs::bmc(src, dst, u16::MAX));
self.emit(instrs::addi64(src, src, u16::MAX as _));
self.emit(instrs::addi64(dst, dst, u16::MAX as _));
}
self.emit(instrs::bmc(src, dst, size as u16));
self.emit(instrs::addi64(src, src, size.wrapping_neg() as _));
self.emit(instrs::addi64(dst, dst, size.wrapping_neg() as _));
}
}
}
if i != fuc.func.blocks.len() - 1 {
let rel = Reloc::new(self.code.len(), 1, 4);
self.ret_relocs.push(rel);
self.emit(instrs::jmp(0));
}
}
Kind::Die => self.emit(instrs::un()),
Kind::CInt { value } if node.ty.is_float() => {
self.emit(match node.ty {
ty::Id::F32 => instrs::li32(
atr(nid),
(f64::from_bits(value as _) as f32).to_bits(),
),
ty::Id::F64 => instrs::li64(atr(nid), value as _),
_ => unreachable!(),
});
}
Kind::CInt { value } => self.emit(match tys.size_of(node.ty) {
1 => instrs::li8(atr(nid), value as _),
2 => instrs::li16(atr(nid), value as _),
4 => instrs::li32(atr(nid), value as _),
_ => instrs::li64(atr(nid), value as _),
}),
Kind::UnOp { op } => {
let op = op
.unop(node.ty, fuc.nodes[node.inputs[1]].ty)
.expect("TODO: unary operator not supported");
self.emit(op(atr(nid), atr(node.inputs[1])));
}
Kind::BinOp { .. } if node.lock_rc != 0 => {}
Kind::BinOp { op } => {
let &[.., lhs, rhs] = node.inputs.as_slice() else { unreachable!() };
if let Kind::CInt { value } = fuc.nodes[rhs].kind
&& fuc.nodes[rhs].lock_rc != 0
&& let Some(op) = op.imm_binop(node.ty)
{
self.emit(op(atr(nid), atr(lhs), value as _));
} else if let Some(op) =
op.binop(node.ty).or(op.float_cmp(fuc.nodes[lhs].ty))
{
self.emit(op(atr(nid), atr(lhs), atr(rhs)));
} else if let Some(against) = op.cmp_against() {
let op_ty = fuc.nodes[lhs].ty;
self.emit(extend(fuc.nodes[lhs].ty, fuc.nodes[lhs].ty.extend(), 0, 0));
self.emit(extend(fuc.nodes[rhs].ty, fuc.nodes[rhs].ty.extend(), 1, 1));
if op_ty.is_float() && matches!(op, TokenKind::Le | TokenKind::Ge) {
let opop = match op {
TokenKind::Le => TokenKind::Gt,
TokenKind::Ge => TokenKind::Lt,
_ => unreachable!(),
};
let op_fn = opop.float_cmp(op_ty).unwrap();
self.emit(op_fn(atr(nid), atr(lhs), atr(rhs)));
self.emit(instrs::not(atr(nid), atr(nid)));
} else if op_ty.is_integer() {
let op_fn =
if op_ty.is_signed() { instrs::cmps } else { instrs::cmpu };
self.emit(op_fn(atr(nid), atr(lhs), atr(rhs)));
self.emit(instrs::cmpui(atr(nid), atr(nid), against));
if matches!(op, TokenKind::Eq | TokenKind::Lt | TokenKind::Gt) {
self.emit(instrs::not(atr(nid), atr(nid)));
}
} else {
todo!("unhandled operator: {op}");
}
} else {
todo!("unhandled operator: {op}");
}
}
Kind::Call { args, func } => {
let (ret, mut parama) = tys.parama(node.ty);
let mut args = args.args();
let mut allocs = node.inputs[1..].iter();
while let Some(arg) = args.next(tys) {
let Arg::Value(ty) = arg else { continue };
let Some(loc) = parama.next(ty, tys) else { continue };
let mut arg = *allocs.next().unwrap();
let (rg, size) = match loc {
PLoc::Reg(rg, size) if ty.loc(tys) == Loc::Stack => (rg, size),
PLoc::WideReg(rg, size) => (rg, size),
PLoc::Ref(r, ..) => {
arg = match fuc.nodes[arg].kind {
Kind::Load { .. } => fuc.nodes[arg].inputs[1],
_ => arg,
};
self.emit(instrs::cp(r, atr(arg)));
continue;
}
PLoc::Reg(r, ..) => {
self.emit(instrs::cp(r, atr(arg)));
continue;
}
};
arg = match fuc.nodes[arg].kind {
Kind::Load { .. } => fuc.nodes[arg].inputs[1],
_ => arg,
};
self.emit(instrs::ld(rg, atr(arg), 0, size));
}
debug_assert!(
!matches!(ret, Some(PLoc::Ref(..))) || allocs.next().is_some()
);
if func == ty::ECA {
self.emit(instrs::eca());
} else {
self.relocs.push(TypedReloc {
target: ty::Kind::Func(func).compress(),
reloc: Reloc::new(self.code.len(), 3, 4),
});
self.emit(instrs::jal(reg::RET_ADDR, reg::ZERO, 0));
}
match ret {
Some(PLoc::WideReg(r, size)) => {
debug_assert_eq!(
fuc.nodes[*node.inputs.last().unwrap()].kind,
Kind::Stck
);
let stck = self.offsets[*node.inputs.last().unwrap() as usize];
self.emit(instrs::st(r, reg::STACK_PTR, stck as _, size));
}
Some(PLoc::Reg(r, size)) if node.ty.loc(tys) == Loc::Stack => {
debug_assert_eq!(
fuc.nodes[*node.inputs.last().unwrap()].kind,
Kind::Stck
);
let stck = self.offsets[*node.inputs.last().unwrap() as usize];
self.emit(instrs::st(r, reg::STACK_PTR, stck as _, size));
}
Some(PLoc::Reg(r, ..)) => self.emit(instrs::cp(atr(nid), r)),
None | Some(PLoc::Ref(..)) => {}
}
}
Kind::Global { global } => {
let reloc = Reloc::new(self.code.len(), 3, 4);
self.relocs.push(TypedReloc {
target: ty::Kind::Global(global).compress(),
reloc,
});
self.emit(instrs::lra(atr(nid), 0, 0));
}
Kind::Stck => {
let base = reg::STACK_PTR;
let offset = self.offsets[nid as usize];
self.emit(instrs::addi64(atr(nid), base, offset as _));
}
Kind::Load => {
let mut region = node.inputs[1];
let mut offset = 0;
if fuc.nodes[region].kind == (Kind::BinOp { op: TokenKind::Add })
&& let Kind::CInt { value } =
fuc.nodes[fuc.nodes[region].inputs[2]].kind
{
region = fuc.nodes[region].inputs[1];
offset = value as Offset;
}
let size = tys.size_of(node.ty);
if node.ty.loc(tys) != Loc::Stack {
let (base, offset) = match fuc.nodes[region].kind {
Kind::Stck => {
(reg::STACK_PTR, self.offsets[region as usize] + offset)
}
_ => (atr(region), offset),
};
self.emit(instrs::ld(atr(nid), base, offset as _, size as _));
}
}
Kind::Stre if node.inputs[1] == VOID => {}
Kind::Stre => {
let mut region = node.inputs[2];
let mut offset = 0;
let size = u16::try_from(tys.size_of(node.ty)).expect("TODO");
if fuc.nodes[region].kind == (Kind::BinOp { op: TokenKind::Add })
&& let Kind::CInt { value } =
fuc.nodes[fuc.nodes[region].inputs[2]].kind
&& node.ty.loc(tys) == Loc::Reg
{
region = fuc.nodes[region].inputs[1];
offset = value as Offset;
}
let nd = &fuc.nodes[region];
let value = node.inputs[1];
let (base, offset, src) = match nd.kind {
Kind::Stck if node.ty.loc(tys) == Loc::Reg => {
(reg::STACK_PTR, self.offsets[region as usize] + offset, value)
}
_ => (atr(region), offset, match fuc.nodes[value].kind {
Kind::Load { .. } => fuc.nodes[value].inputs[1],
_ => value,
}),
};
match node.ty.loc(tys) {
Loc::Reg => self.emit(instrs::st(atr(src), base, offset as _, size)),
Loc::Stack => {
debug_assert_eq!(offset, 0);
self.emit(instrs::bmc(atr(src), base, size))
}
}
}
Kind::Mem => self.emit(instrs::cp(atr(MEM), reg::RET)),
Kind::Arg => {}
e @ (Kind::Start
| Kind::Entry
| Kind::End
| Kind::Loops
| Kind::Then
| Kind::Else
| Kind::Phi
| Kind::Assert { .. }) => unreachable!("{e:?}"),
}
}
}
self.ralloc_my = res;
let bundle_count = self.ralloc_my.bundles.len() + (reg_offset as usize);
(
if fuc.tail {
bundle_count.saturating_sub(reg::RET_ADDR as _)
} else {
assert!(bundle_count < reg::STACK_PTR as usize, "TODO: spill memory");
self.ralloc_my.bundles.len()
},
fuc.tail,
)
}
}
pub struct Function<'a> {
sig: Sig,
tail: bool,
backrefs: Vec<u16>,
nodes: &'a mut Nodes,
tys: &'a Types,
visited: BitSet,
func: Func,
}
impl Function<'_> {
fn vreg_count(&self) -> usize {
self.nodes.values.len()
}
fn uses_of(&self, nid: Nid, buf: &mut Vec<Nid>) {
if self.nodes[nid].kind.is_cfg() && !matches!(self.nodes[nid].kind, Kind::Call { .. }) {
return;
}
self.nodes[nid]
.outputs
.iter()
.filter(|&&n| self.nodes.is_data_dep(nid, n))
.collect_into(buf);
}
fn phi_inputs_of(&self, nid: Nid, buf: &mut Vec<Nid>) {
match self.nodes[nid].kind {
Kind::Region => {
for &inp in self.nodes[nid].outputs.as_slice() {
if self.nodes[inp].is_data_phi() {
buf.extend(&self.nodes[inp].inputs[1..]);
buf.push(inp);
}
}
}
Kind::Loop => {
for &inp in self.nodes[nid].outputs.as_slice() {
if self.nodes[inp].is_data_phi() {
buf.push(self.nodes[inp].inputs[1]);
buf.push(inp);
buf.push(self.nodes[inp].inputs[2]);
}
}
}
_ => {}
}
}
fn instr_of(&self, nid: Nid) -> Option<Nid> {
if self.nodes[nid].kind == Kind::Phi || self.nodes[nid].lock_rc != 0 {
return None;
}
debug_assert_ne!(self.backrefs[nid as usize], Nid::MAX, "{:?}", self.nodes[nid]);
Some(self.backrefs[nid as usize])
}
fn block_of(&self, nid: Nid) -> Nid {
debug_assert!(self.nodes[nid].kind.starts_basic_block());
self.backrefs[nid as usize]
}
fn idom_of(&self, mut nid: Nid) -> Nid {
while !self.nodes[nid].kind.starts_basic_block() {
nid = self.nodes.idom(nid);
}
nid
}
fn use_block(&self, inst: Nid, uinst: Nid) -> Nid {
let mut block = self.nodes.use_block(inst, uinst);
while !self.nodes[block].kind.starts_basic_block() {
block = self.nodes.idom(block);
}
block
}
}
impl core::fmt::Debug for Function<'_> {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
for block in &self.func.blocks {
writeln!(f, "{:?}", self.nodes[block.entry].kind)?;
for &instr in &self.func.instrs[block.range.clone()] {
writeln!(f, "{:?}", self.nodes[instr].kind)?;
}
}
Ok(())
}
}
impl<'a> Function<'a> {
fn new(nodes: &'a mut Nodes, tys: &'a Types, sig: Sig) -> Self {
let mut s = Self {
backrefs: vec![u16::MAX; nodes.values.len()],
tail: true,
nodes,
tys,
sig,
visited: Default::default(),
func: Default::default(),
};
s.visited.clear(s.nodes.values.len());
s.emit_node(VOID);
s
}
fn add_block(&mut self, entry: Nid) {
self.func
.blocks
.push(Block { range: self.func.instrs.len()..self.func.instrs.len(), entry });
self.backrefs[entry as usize] = self.func.blocks.len() as u16 - 1;
}
fn close_block(&mut self, exit: Nid) {
if !matches!(self.nodes[exit].kind, Kind::Loop | Kind::Region) {
self.add_instr(exit);
} else {
self.func.instrs.push(exit);
}
let prev = self.func.blocks.last_mut().unwrap();
prev.range.end = self.func.instrs.len();
}
fn add_instr(&mut self, nid: Nid) {
debug_assert_ne!(self.nodes[nid].kind, Kind::Loop);
self.backrefs[nid as usize] = self.func.instrs.len() as u16;
self.func.instrs.push(nid);
}
fn emit_node(&mut self, nid: Nid) {
if matches!(self.nodes[nid].kind, Kind::Region | Kind::Loop) {
match (self.nodes[nid].kind, self.visited.set(nid)) {
(Kind::Loop, false) | (Kind::Region, true) => {
self.close_block(nid);
return;
}
_ => {}
}
} else if !self.visited.set(nid) {
return;
}
if self.nodes.is_never_used(nid, self.tys) {
self.nodes.lock(nid);
return;
}
let mut node = self.nodes[nid].clone();
match node.kind {
Kind::Start => {
debug_assert_matches!(self.nodes[node.outputs[0]].kind, Kind::Entry);
self.add_block(VOID);
self.emit_node(node.outputs[0])
}
Kind::If => {
let &[_, cond] = node.inputs.as_slice() else { unreachable!() };
let &[mut then, mut else_] = node.outputs.as_slice() else { unreachable!() };
if let Kind::BinOp { op } = self.nodes[cond].kind
&& let Some((_, swapped)) = op.cond_op(node.ty)
&& swapped
{
mem::swap(&mut then, &mut else_);
}
self.close_block(nid);
self.emit_node(then);
self.emit_node(else_);
}
Kind::Region | Kind::Loop => {
self.close_block(nid);
self.add_block(nid);
self.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Return | Kind::Die => {
self.close_block(nid);
self.emit_node(node.outputs[0]);
}
Kind::Entry => {
let (ret, mut parama) = self.tys.parama(self.sig.ret);
let mut typs = self.sig.args.args();
#[expect(clippy::unnecessary_to_owned)]
let mut args = self.nodes[VOID].outputs[ARG_START..].to_owned().into_iter();
while let Some(ty) = typs.next_value(self.tys) {
let arg = args.next().unwrap();
debug_assert_eq!(self.nodes[arg].kind, Kind::Arg);
match parama.next(ty, self.tys) {
None => {}
Some(_) => self.add_instr(arg),
}
}
if let Some(PLoc::Ref(..)) = ret {
self.add_instr(MEM);
}
self.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Then | Kind::Else => {
self.add_block(nid);
self.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
self.emit_node(o);
}
}
Kind::Call { func, .. } => {
self.tail &= func == ty::ECA;
self.add_instr(nid);
self.reschedule_block(nid, &mut node.outputs);
for o in node.outputs.into_iter().rev() {
if self.nodes[o].inputs[0] == nid
|| (matches!(self.nodes[o].kind, Kind::Loop | Kind::Region)
&& self.nodes[o].inputs[1] == nid)
{
self.emit_node(o);
}
}
}
Kind::CInt { .. }
| Kind::BinOp { .. }
| Kind::UnOp { .. }
| Kind::Global { .. }
| Kind::Load { .. }
| Kind::Stre
| Kind::Stck => self.add_instr(nid),
Kind::End | Kind::Phi | Kind::Arg | Kind::Mem | Kind::Loops => {}
Kind::Assert { .. } => unreachable!(),
}
}
fn reschedule_block(&mut self, from: Nid, outputs: &mut Vc) {
let from = Some(&from);
let mut buf = Vec::with_capacity(outputs.len());
let mut seen = BitSet::default();
seen.clear(self.nodes.values.len());
for &o in outputs.iter() {
if !self.nodes.is_cfg(o) {
continue;
}
seen.set(o);
let mut cursor = buf.len();
buf.push(o);
while let Some(&n) = buf.get(cursor) {
for &i in &self.nodes[n].inputs[1..] {
if from == self.nodes[i].inputs.first()
&& self.nodes[i]
.outputs
.iter()
.all(|&o| self.nodes[o].inputs.first() != from || seen.get(o))
&& seen.set(i)
{
buf.push(i);
}
}
cursor += 1;
}
}
for &o in outputs.iter() {
if !seen.set(o) {
continue;
}
let mut cursor = buf.len();
buf.push(o);
while let Some(&n) = buf.get(cursor) {
for &i in &self.nodes[n].inputs[1..] {
if from == self.nodes[i].inputs.first()
&& self.nodes[i]
.outputs
.iter()
.all(|&o| self.nodes[o].inputs.first() != from || seen.get(o))
&& seen.set(i)
{
buf.push(i);
}
}
cursor += 1;
}
}
debug_assert!(
outputs.len() == buf.len() || outputs.len() == buf.len() + 1,
"{:?} {:?}",
outputs,
buf
);
if buf.len() + 1 == outputs.len() {
outputs.remove(outputs.len() - 1);
}
outputs.copy_from_slice(&buf);
}
}
pub struct Env<'a> {
ctx: &'a Function<'a>,
func: &'a Func,
res: &'a mut Res,
}
impl<'a> Env<'a> {
pub fn new(ctx: &'a Function<'a>, func: &'a Func, res: &'a mut Res) -> Self {
Self { ctx, func, res }
}
pub fn run(&mut self) {
self.res.bundles.clear();
self.res.node_to_reg.clear();
self.res.node_to_reg.resize(self.ctx.vreg_count(), 0);
debug_assert!(self.res.dfs_buf.is_empty());
debug_assert!(self.res.use_buf.is_empty());
debug_assert!(self.res.phi_input_buf.is_empty());
let mut bundle = Bundle::new(self.func.instrs.len());
let mut visited = BitSet::with_capacity(self.ctx.nodes.values.len());
let mut use_buf = mem::take(&mut self.res.use_buf);
let mut phi_input_buf = mem::take(&mut self.res.phi_input_buf);
for block in &self.func.blocks {
self.ctx.phi_inputs_of(block.entry, &mut phi_input_buf);
for param in phi_input_buf.drain(..) {
if !visited.set(param) {
continue;
}
self.append_bundle(param, &mut bundle, &mut use_buf);
}
}
self.res.phi_input_buf = phi_input_buf;
for &inst in &self.func.instrs {
if visited.get(inst) || inst == 0 {
continue;
}
self.append_bundle(inst, &mut bundle, &mut use_buf);
}
self.res.use_buf = use_buf;
}
fn append_bundle(&mut self, inst: Nid, bundle: &mut Bundle, use_buf: &mut Vec<Nid>) {
let mut dom = self.ctx.idom_of(inst);
if self.ctx.nodes[dom].kind == Kind::Loop && self.ctx.nodes[inst].kind == Kind::Phi {
dom = self.ctx.nodes.idom(dom);
dom = self.ctx.idom_of(dom);
}
self.ctx.uses_of(inst, use_buf);
for uinst in use_buf.drain(..) {
let cursor = self.ctx.use_block(inst, uinst);
self.reverse_cfg_dfs(cursor, dom, |_, n, b| {
let mut range = b.range.clone();
range.start =
range.start.max(self.ctx.instr_of(inst).map_or(0, |n| n + 1) as usize);
range.end = range.end.min(
self.ctx
.instr_of(uinst)
.filter(|_| self.ctx.nodes.loop_depth(dom) == self.ctx.nodes.loop_depth(n))
.map_or(Nid::MAX, |n| n + 1) as usize,
);
bundle.add(range);
});
}
match self.res.bundles.iter_mut().enumerate().find(|(_, b)| !b.overlaps(bundle)) {
Some((i, other)) => {
other.merge(bundle);
bundle.clear();
self.res.node_to_reg[inst as usize] = i as Reg + 1;
}
None => {
self.res.bundles.push(mem::replace(bundle, Bundle::new(self.func.instrs.len())));
self.res.node_to_reg[inst as usize] = self.res.bundles.len() as Reg;
}
}
}
fn reverse_cfg_dfs(
&mut self,
from: Nid,
until: Nid,
mut each: impl FnMut(&mut Self, Nid, &Block),
) {
debug_assert!(self.res.dfs_buf.is_empty());
self.res.dfs_buf.push(from);
self.res.dfs_seem.clear(self.ctx.nodes.values.len());
while let Some(nid) = self.res.dfs_buf.pop() {
each(self, nid, &self.func.blocks[self.ctx.block_of(nid) as usize]);
if nid == until {
continue;
}
match self.ctx.nodes[nid].kind {
Kind::Then | Kind::Else | Kind::Region | Kind::Loop => {
for &n in self.ctx.nodes[nid].inputs.iter() {
let d = self.ctx.idom_of(n);
if self.res.dfs_seem.set(d) {
self.res.dfs_buf.push(d);
}
}
}
Kind::Start => {}
_ => unreachable!(),
}
}
}
}
#[derive(Default)]
pub struct Res {
pub bundles: Vec<Bundle>,
pub node_to_reg: Vec<Reg>,
use_buf: Vec<Nid>,
phi_input_buf: Vec<Nid>,
dfs_buf: Vec<Nid>,
dfs_seem: BitSet,
}
pub struct Bundle {
taken: Vec<bool>,
}
impl Bundle {
fn new(size: usize) -> Self {
Self { taken: vec![false; size] }
}
fn add(&mut self, range: Range<usize>) {
self.taken[range].fill(true);
}
fn overlaps(&self, other: &Self) -> bool {
self.taken.iter().zip(other.taken.iter()).any(|(a, b)| a & b)
}
fn merge(&mut self, other: &Self) {
debug_assert!(!self.overlaps(other));
self.taken.iter_mut().zip(other.taken.iter()).for_each(|(a, b)| *a |= *b);
}
fn clear(&mut self) {
self.taken.fill(false);
}
}
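A minimal sketch (not from the removed source) of the bundle idea this allocator used: a Bundle marks the instruction indices where a value is live, and two values whose bundles do not overlap may be merged into one bundle, i.e. share a register.

    // Two non-overlapping live ranges end up in one bundle (one register).
    let mut a = Bundle::new(8);
    a.add(0..3);
    let mut b = Bundle::new(8);
    b.add(4..6);
    assert!(!a.overlaps(&b));
    a.merge(&b); // `a` now covers 0..3 and 4..6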
#[derive(Default)]
pub struct Func {
pub blocks: Vec<Block>,
pub instrs: Vec<Nid>,
}
pub struct Block {
pub range: Range<usize>,
pub entry: Nid,
}

File diff suppressed because it is too large.

lang/src/ty.rs — new file, 1383 lines

File diff suppressed because it is too large.

@ -1,20 +1,40 @@
#![expect(dead_code)]
use { use {
alloc::alloc, alloc::alloc,
core::{ core::{
alloc::Layout, alloc::Layout,
fmt::Debug, fmt::Debug,
hint::unreachable_unchecked, hint::unreachable_unchecked,
marker::PhantomData,
mem::MaybeUninit, mem::MaybeUninit,
ops::{Deref, DerefMut, Not}, ops::{Deref, DerefMut, Not, Range},
ptr::Unique, ptr::Unique,
}, },
}; };
fn decide(b: bool, name: &'static str) -> Result<(), &'static str> {
b.then_some(()).ok_or(name)
}
pub fn is_snake_case(str: &str) -> Result<(), &'static str> {
decide(str.bytes().all(|c| matches!(c, b'a'..=b'z' | b'0'..=b'9' | b'_')), "snake_case")
}
pub fn is_pascal_case(str: &str) -> Result<(), &'static str> {
decide(
str.as_bytes()[0].is_ascii_uppercase() && str.bytes().all(|c| c.is_ascii_alphanumeric()),
"PascalCase",
)
}
pub fn is_screaming_case(str: &str) -> Result<(), &'static str> {
decide(str.bytes().all(|c| matches!(c, b'A'..=b'Z' | b'0'..=b'9' | b'_')), "SCREAMING_CASE")
}
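A hedged sketch (not part of the diff) of how the naming-convention helpers above behave: Ok(()) when the convention holds, otherwise the convention's name as the error string.

    assert_eq!(is_snake_case("loop_depth"), Ok(()));
    assert_eq!(is_snake_case("LoopDepth"), Err("snake_case"));
    assert_eq!(is_pascal_case("BitSet"), Ok(()));
    assert_eq!(is_screaming_case("SIN_TABLE"), Ok(()));
    assert_eq!(is_screaming_case("sin_table"), Err("SCREAMING_CASE"));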
type Nid = u16; type Nid = u16;
type BitSetUnit = usize;
pub union BitSet { pub union BitSet {
inline: usize, inline: BitSetUnit,
alloced: Unique<AllocedBitSet>, alloced: Unique<AllocedBitSet>,
} }
@ -58,9 +78,9 @@ impl Default for BitSet {
} }
impl BitSet { impl BitSet {
const FLAG: usize = 1 << (Self::UNIT - 1); const FLAG: BitSetUnit = 1 << (Self::UNIT - 1);
const INLINE_ELEMS: usize = Self::UNIT - 1; const INLINE_ELEMS: usize = Self::UNIT - 1;
const UNIT: usize = core::mem::size_of::<usize>() * 8; pub const UNIT: usize = core::mem::size_of::<BitSetUnit>() * 8;
pub fn with_capacity(len: usize) -> Self { pub fn with_capacity(len: usize) -> Self {
let mut s = Self::default(); let mut s = Self::default();
@ -72,7 +92,7 @@ impl BitSet {
unsafe { self.inline & Self::FLAG != 0 } unsafe { self.inline & Self::FLAG != 0 }
} }
fn data_and_len(&self) -> (&[usize], usize) { fn data_and_len(&self) -> (&[BitSetUnit], usize) {
unsafe { unsafe {
if self.is_inline() { if self.is_inline() {
(core::slice::from_ref(&self.inline), Self::INLINE_ELEMS) (core::slice::from_ref(&self.inline), Self::INLINE_ELEMS)
@ -80,16 +100,16 @@ impl BitSet {
let small_vec = self.alloced.as_ref(); let small_vec = self.alloced.as_ref();
( (
core::slice::from_raw_parts( core::slice::from_raw_parts(
&small_vec.data as *const _ as *const usize, &small_vec.data as *const _ as *const BitSetUnit,
small_vec.cap, small_vec.cap,
), ),
small_vec.cap * core::mem::size_of::<usize>() * 8, small_vec.cap * Self::UNIT,
) )
} }
} }
} }
fn data_mut_and_len(&mut self) -> (&mut [usize], usize) { fn data_mut_and_len(&mut self) -> (&mut [BitSetUnit], usize) {
unsafe { unsafe {
if self.is_inline() { if self.is_inline() {
(core::slice::from_mut(&mut self.inline), INLINE_ELEMS) (core::slice::from_mut(&mut self.inline), INLINE_ELEMS)
@ -97,7 +117,7 @@ impl BitSet {
let small_vec = self.alloced.as_mut(); let small_vec = self.alloced.as_mut();
( (
core::slice::from_raw_parts_mut( core::slice::from_raw_parts_mut(
&mut small_vec.data as *mut _ as *mut usize, &mut small_vec.data as *mut _ as *mut BitSetUnit,
small_vec.cap, small_vec.cap,
), ),
small_vec.cap * Self::UNIT, small_vec.cap * Self::UNIT,
@ -124,11 +144,12 @@ impl BitSet {
let index = index as usize; let index = index as usize;
let (mut data, len) = self.data_mut_and_len(); let (mut data, len) = self.data_mut_and_len();
if core::intrinsics::unlikely(index >= len) { if core::intrinsics::unlikely(index >= len) {
self.grow(index.next_power_of_two().max(4 * Self::UNIT)); self.grow((index + 1).next_power_of_two().max(4 * Self::UNIT));
(data, _) = self.data_mut_and_len(); (data, _) = self.data_mut_and_len();
} }
let (elem, bit) = Self::indexes(index); let (elem, bit) = Self::indexes(index);
debug_assert!(elem < data.len(), "{} < {}", elem, data.len());
let elem = unsafe { data.get_unchecked_mut(elem) }; let elem = unsafe { data.get_unchecked_mut(elem) };
let prev = *elem; let prev = *elem;
*elem |= 1 << bit; *elem |= 1 << bit;
@ -142,7 +163,7 @@ impl BitSet {
let (ptr, prev_len) = unsafe { let (ptr, prev_len) = unsafe {
if self.is_inline() { if self.is_inline() {
let ptr = alloc::alloc(layout); let ptr = alloc::alloc(layout);
*ptr.add(off).cast::<usize>() = self.inline & !Self::FLAG; *ptr.add(off).cast::<BitSetUnit>() = self.inline & !Self::FLAG;
(ptr, 1) (ptr, 1)
} else { } else {
let prev_len = self.alloced.as_ref().cap; let prev_len = self.alloced.as_ref().cap;
@ -153,7 +174,7 @@ impl BitSet {
unsafe { unsafe {
MaybeUninit::fill( MaybeUninit::fill(
core::slice::from_raw_parts_mut( core::slice::from_raw_parts_mut(
ptr.add(off).cast::<MaybeUninit<usize>>().add(prev_len), ptr.add(off).cast::<MaybeUninit<BitSetUnit>>().add(prev_len),
slot_count - prev_len, slot_count - prev_len,
), ),
0, 0,
@ -166,7 +187,7 @@ impl BitSet {
fn layout(slot_count: usize) -> (core::alloc::Layout, usize) { fn layout(slot_count: usize) -> (core::alloc::Layout, usize) {
unsafe { unsafe {
core::alloc::Layout::new::<AllocedBitSet>() core::alloc::Layout::new::<AllocedBitSet>()
.extend(Layout::array::<usize>(slot_count).unwrap_unchecked()) .extend(Layout::array::<BitSetUnit>(slot_count).unwrap_unchecked())
.unwrap_unchecked() .unwrap_unchecked()
} }
} }
@ -184,6 +205,10 @@ impl BitSet {
pub fn clear(&mut self, len: usize) { pub fn clear(&mut self, len: usize) {
self.reserve(len); self.reserve(len);
self.clear_as_is();
}
pub fn clear_as_is(&mut self) {
if self.is_inline() { if self.is_inline() {
unsafe { self.inline &= Self::FLAG }; unsafe { self.inline &= Self::FLAG };
} else { } else {
@ -191,7 +216,11 @@ impl BitSet {
} }
} }
pub fn units<'a>(&'a self, slot: &'a mut usize) -> &'a [usize] { pub fn approx_unit_cap(&self) -> usize {
self.data_and_len().0.len()
}
pub fn units<'a>(&'a self, slot: &'a mut BitSetUnit) -> &'a [BitSetUnit] {
if self.is_inline() { if self.is_inline() {
*slot = unsafe { self.inline } & !Self::FLAG; *slot = unsafe { self.inline } & !Self::FLAG;
core::slice::from_ref(slot) core::slice::from_ref(slot)
@ -200,36 +229,47 @@ impl BitSet {
} }
} }
pub fn units_mut(&mut self) -> Option<&mut [BitSetUnit]> {
self.is_inline().not().then(|| self.data_mut_and_len().0)
}
pub fn reserve(&mut self, len: usize) { pub fn reserve(&mut self, len: usize) {
if len > self.data_and_len().1 { if len > self.data_and_len().1 {
self.grow(len.next_power_of_two().max(4 * Self::UNIT)); self.grow(len.next_power_of_two().max(4 * Self::UNIT));
} }
} }
pub fn units_mut(&mut self) -> Result<&mut [usize], &mut InlineBitSetView> { pub fn set_range(&mut self, proj_range: Range<usize>) {
if self.is_inline() { if proj_range.is_empty() {
Err(unsafe { return;
core::mem::transmute::<&mut usize, &mut InlineBitSetView>(&mut self.inline) }
})
self.reserve(proj_range.end);
let (units, _) = self.data_mut_and_len();
if proj_range.start / Self::UNIT == (proj_range.end - 1) / Self::UNIT {
debug_assert!(proj_range.len() <= Self::UNIT);
let mask = ((1 << proj_range.len()) - 1) << (proj_range.start % Self::UNIT);
units[proj_range.start / Self::UNIT] |= mask;
} else { } else {
Ok(self.data_mut_and_len().0) let fill_range = proj_range.start.div_ceil(Self::UNIT)..proj_range.end / Self::UNIT;
} units[fill_range].fill(BitSetUnit::MAX);
}
}
pub struct InlineBitSetView(usize); let prefix_len = Self::UNIT - proj_range.start % Self::UNIT;
let prefix_mask = ((1 << prefix_len) - 1) << (proj_range.start % Self::UNIT);
units[proj_range.start / Self::UNIT] |= prefix_mask;
impl InlineBitSetView { let postfix_len = proj_range.end % Self::UNIT;
pub(crate) fn add_mask(&mut self, tmp: usize) { let postfix_mask = (1 << postfix_len) - 1;
debug_assert!(tmp & BitSet::FLAG == 0); units[proj_range.end / Self::UNIT] |= postfix_mask;
self.0 |= tmp; }
} }
} }
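A standalone arithmetic sketch (not from the source) of the single-unit branch of set_range above: when the whole range falls inside one unit, the mask is (1 << len) - 1 shifted to the range's starting bit. Assumes 64-bit units and a non-empty range shorter than a unit.

    // Setting bits 5..9 inside one 64-bit unit:
    let (start, end) = (5usize, 9usize);
    let mask: u64 = ((1 << (end - start)) - 1) << (start % 64);
    assert_eq!(mask, 0b1_1110_0000); // bits 5, 6, 7 and 8 set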
pub struct BitSetIter<'a> { pub struct BitSetIter<'a> {
index: usize, index: usize,
current: usize, current: BitSetUnit,
remining: &'a [usize], remining: &'a [BitSetUnit],
} }
impl Iterator for BitSetIter<'_> { impl Iterator for BitSetIter<'_> {
@ -249,7 +289,7 @@ impl Iterator for BitSetIter<'_> {
struct AllocedBitSet { struct AllocedBitSet {
cap: usize, cap: usize,
data: [usize; 0], data: [BitSetUnit; 0],
} }
#[cfg(test)] #[cfg(test)]
@ -360,9 +400,8 @@ impl Vc {
} }
pub fn push(&mut self, value: Nid) { pub fn push(&mut self, value: Nid) {
if let Some(layout) = self.layout() if let Some(layout) = self.layout() {
&& unsafe { self.alloced.len == self.alloced.cap } if unsafe { self.alloced.len == self.alloced.cap } {
{
unsafe { unsafe {
self.alloced.cap *= 2; self.alloced.cap *= 2;
self.alloced.base = Unique::new_unchecked( self.alloced.base = Unique::new_unchecked(
@ -374,11 +413,13 @@ impl Vc {
.cast(), .cast(),
); );
} }
}
} else if self.len() == INLINE_ELEMS { } else if self.len() == INLINE_ELEMS {
unsafe { unsafe {
let mut allcd = let mut allcd =
Self::alloc((self.inline.cap + 1).next_power_of_two() as _, self.len()); Self::alloc((self.inline.cap + 1).next_power_of_two() as _, self.len());
core::ptr::copy_nonoverlapping(self.as_ptr(), allcd.as_mut_ptr(), self.len()); core::ptr::copy_nonoverlapping(self.as_ptr(), allcd.as_mut_ptr(), self.len());
debug_assert!(!allcd.is_inline());
*self = allcd; *self = allcd;
} }
} }
@ -532,3 +573,117 @@ struct AllocedVc {
len: Nid, len: Nid,
base: Unique<Nid>, base: Unique<Nid>,
} }
pub trait Ent: Copy {
fn new(index: usize) -> Self;
fn index(self) -> usize;
}
#[repr(transparent)]
pub struct EntSlice<K: Ent, T> {
k: PhantomData<fn(K)>,
data: [T],
}
impl<'a, K: Ent, T> From<&'a [T]> for &'a EntSlice<K, T> {
fn from(value: &'a [T]) -> Self {
unsafe { core::mem::transmute(value) }
}
}
impl<K: Ent, T> core::ops::Index<K> for EntSlice<K, T> {
type Output = T;
fn index(&self, index: K) -> &Self::Output {
&self.data[index.index()]
}
}
pub struct EntVec<K: Ent, T> {
data: ::alloc::vec::Vec<T>,
k: PhantomData<fn(K)>,
}
impl<K: Ent, T> Default for EntVec<K, T> {
fn default() -> Self {
Self { data: Default::default(), k: PhantomData }
}
}
impl<K: Ent, T> EntVec<K, T> {
pub fn clear(&mut self) {
self.data.clear();
}
pub fn is_empty(&self) -> bool {
self.data.is_empty()
}
pub fn len(&self) -> usize {
self.data.len()
}
pub fn push(&mut self, value: T) -> K {
let k = K::new(self.data.len());
self.data.push(value);
k
}
pub fn next(&self, index: K) -> Option<&T> {
self.data.get(index.index() + 1)
}
pub fn shadow(&mut self, len: usize)
where
T: Default,
{
if self.data.len() < len {
self.data.resize_with(len, Default::default);
}
}
pub fn iter(&self) -> core::slice::Iter<T> {
self.data.iter()
}
}
impl<K: Ent, T> core::ops::Index<K> for EntVec<K, T> {
type Output = T;
fn index(&self, index: K) -> &Self::Output {
&self.data[index.index()]
}
}
impl<K: Ent, T> core::ops::IndexMut<K> for EntVec<K, T> {
fn index_mut(&mut self, index: K) -> &mut Self::Output {
&mut self.data[index.index()]
}
}
macro_rules! decl_ent {
($(
$vis:vis struct $name:ident($index:ty);
)*) => {$(
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Debug)]
$vis struct $name($index);
impl crate::utils::Ent for $name {
fn new(index: usize) -> Self {
Self(index as $index)
}
fn index(self) -> usize {
self.0 as _
}
}
impl core::fmt::Display for $name {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
write!(f, concat!(stringify!($name), "{}"), self.0)
}
}
)*};
}
pub(crate) use decl_ent;
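A hedged usage sketch (not part of the diff) of the entity utilities above: decl_ent! declares a strongly typed index, and EntVec only accepts that index type, so ids of different entity kinds cannot be mixed up. FuncId is a made-up name for illustration.

    decl_ent! { pub struct FuncId(u32); } // hypothetical entity, not from the diff

    let mut funcs: EntVec<FuncId, &str> = EntVec::default();
    let id: FuncId = funcs.push("main");
    assert_eq!(funcs[id], "main");
    assert_eq!(id.index(), 0); // via the Ent trait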


@ -4,41 +4,44 @@ main:
LI32 r32, 1148846080w LI32 r32, 1148846080w
CP r2, r32 CP r2, r32
JAL r31, r0, :sin JAL r31, r0, :sin
FMUL32 r33, r1, r32 CP r33, r1
FTI32 r1, r33, 1b FMUL32 r32, r33, r32
FTI32 r32, r32, 1b
CP r1, r32
LD r31, r254, 0a, 24h LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
sin: sin:
LI32 r4, 1124073472w CP r13, r2
LI32 r5, 1078530011w LI32 r14, 1124073472w
FMUL32 r7, r2, r4 LI32 r15, 1078530011w
FDIV32 r9, r7, r5 FMUL32 r14, r13, r14
FTI32 r11, r9, 1b FDIV32 r14, r14, r15
ANDI r10, r11, 255d FTI32 r14, r14, 1b
ITF64 r5, r11 ANDI r15, r14, 255d
MULI64 r4, r10, 4d MULI64 r15, r15, 4d
LRA r3, r0, :SIN_TABLE LRA r16, r0, :sin_table
LI32 r7, 1086918619w LI32 r17, 1086918619w
FC64T32 r9, r5, 1b ITF32 r18, r14
ADDI64 r5, r11, 64d ADDI64 r14, r14, 64d
ADD64 r8, r3, r4 ADD64 r15, r16, r15
LI32 r1, 1132462080w LI32 r19, 1132462080w
FMUL32 r6, r9, r7 FMUL32 r17, r18, r17
ANDI r7, r5, 255d ANDI r14, r14, 255d
LI32 r5, 1056964608w LI32 r18, 1056964608w
LD r4, r8, 0a, 4h LD r15, r15, 0a, 4h
FDIV32 r8, r6, r1 FDIV32 r17, r17, r19
MULI64 r6, r7, 4d MULI64 r14, r14, 4d
FMUL32 r10, r4, r5 FMUL32 r18, r15, r18
FSUB32 r11, r2, r8 FSUB32 r13, r13, r17
ADD64 r9, r3, r6 ADD64 r14, r16, r14
FMUL32 r2, r11, r10 FMUL32 r16, r13, r18
LD r12, r9, 0a, 4h LD r14, r14, 0a, 4h
FSUB32 r5, r12, r2 FSUB32 r14, r14, r16
FMUL32 r7, r5, r11 FMUL32 r13, r14, r13
FADD32 r1, r4, r7 FADD32 r13, r15, r13
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 1303 code size: 1311
ret: 826 ret: 826
status: Ok(()) status: Ok(())


@ -1,6 +1,6 @@
main: main:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 22
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -1,6 +1,6 @@
main: main:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 22
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -2,26 +2,31 @@ main:
ADDI64 r254, r254, -56d ADDI64 r254, r254, -56d
ST r31, r254, 24a, 32h ST r31, r254, 24a, 32h
LI64 r32, 1d LI64 r32, 1d
ADDI64 r2, r254, 0d ADDI64 r33, r254, 0d
ST r32, r254, 0a, 8h ST r32, r254, 0a, 8h
LI64 r33, 2d LI64 r34, 2d
ST r33, r254, 8a, 8h ST r34, r254, 8a, 8h
LI64 r34, 4d LI64 r34, 4d
ST r34, r254, 16a, 8h ST r34, r254, 16a, 8h
CP r2, r33
JAL r31, r0, :pass JAL r31, r0, :pass
ADD64 r1, r1, r32 CP r33, r1
ADD64 r32, r33, r32
CP r1, r32
LD r31, r254, 24a, 32h LD r31, r254, 24a, 32h
ADDI64 r254, r254, 56d ADDI64 r254, r254, 56d
JALA r0, r31, 0a JALA r0, r31, 0a
pass: pass:
LD r4, r2, 8a, 8h CP r13, r2
MULI64 r7, r4, 8d LD r14, r13, 8a, 8h
LD r5, r2, 0a, 8h MULI64 r15, r14, 8d
ADD64 r10, r7, r2 LD r16, r13, 0a, 8h
ADD64 r9, r4, r5 ADD64 r13, r15, r13
LD r1, r10, 0a, 8h ADD64 r14, r14, r16
ADD64 r1, r1, r9 LD r13, r13, 0a, 8h
ADD64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 231 code size: 246
ret: 8 ret: 8
status: Ok(()) status: Ok(())


@ -1,7 +1,8 @@
main: main:
LRA r1, r0, :SIN_TABLE LRA r13, r0, :sin_table
LD r1, r1, 80a, 8h LD r13, r13, 80a, 8h
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 767 code size: 770
ret: 1736 ret: 1736
status: Ok(()) status: Ok(())


@ -1,13 +1,14 @@
main: main:
LI64 r1, 1d CP r14, r2
JNE r2, r1, :0 LI64 r13, 1d
JNE r14, r13, :0
JMP :1 JMP :1
0: LI64 r7, 0d 0: JNE r14, r0, :2
JNE r2, r7, :2 LI64 r13, 2d
LI64 r1, 2d
JMP :1 JMP :1
2: LI64 r1, 3d 2: LI64 r13, 3d
1: JALA r0, r31, 0a 1: CP r1, r13
code size: 79 JALA r0, r31, 0a
code size: 75
ret: 2 ret: 2
status: Ok(()) status: Ok(())


@ -1,25 +1,30 @@
main: main:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -24d
ST r31, r254, 0a, 16h ST r31, r254, 0a, 24h
LRA r2, r0, :"abඞ\n\r\t56789\0" LRA r32, r0, :"abඞ\n\r\t56789\0"
CP r2, r32
JAL r31, r0, :str_len JAL r31, r0, :str_len
CP r32, r1 CP r32, r1
LRA r2, r0, :"fff\0" LRA r33, r0, :"fff\0"
CP r2, r33
JAL r31, r0, :str_len JAL r31, r0, :str_len
ADD64 r1, r1, r32 CP r33, r1
LD r31, r254, 0a, 16h ADD64 r32, r33, r32
ADDI64 r254, r254, 16d CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
str_len: str_len:
LI8 r6, 0b CP r13, r2
LI64 r1, 0d CP r15, r0
2: LD r8, r2, 0a, 1h CP r14, r15
ANDI r8, r8, 255d 2: LD r16, r13, 0a, 1h
ANDI r6, r6, 255d ANDI r16, r16, 255d
JNE r8, r6, :0 JNE r16, r15, :0
CP r1, r14
JMP :1 JMP :1
0: ADDI64 r2, r2, 1d 0: ADDI64 r13, r13, 1d
ADDI64 r1, r1, 1d ADDI64 r14, r14, 1d
JMP :2 JMP :2
1: JALA r0, r31, 0a 1: JALA r0, r31, 0a
code size: 216 code size: 216


@ -4,10 +4,10 @@ main:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -8d
ST r31, r254, 0a, 8h ST r31, r254, 0a, 8h
JAL r31, r0, :foo JAL r31, r0, :foo
LI64 r1, 0d CP r1, r0
LD r31, r254, 0a, 8h LD r31, r254, 0a, 8h
ADDI64 r254, r254, 8d ADDI64 r254, r254, 8d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 95 code size: 88
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -0,0 +1,65 @@
box:
CP r13, r2
CP r1, r13
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -32d
ST r31, r254, 0a, 32h
LI32 r32, 1065353216w
CP r2, r32
JAL r31, r0, :box
CP r33, r1
CP r2, r0
JAL r31, r0, :box
CP r34, r1
FCMPLT32 r33, r33, r34
ANDI r33, r33, 255d
JNE r33, r0, :0
CP r2, r32
JAL r31, r0, :box
CP r33, r1
CP r2, r0
JAL r31, r0, :box
CP r34, r1
FCMPGT32 r33, r33, r34
NOT r33, r33
ANDI r33, r33, 255d
JNE r33, r0, :1
CP r2, r0
JAL r31, r0, :box
CP r33, r1
CP r2, r32
JAL r31, r0, :box
CP r34, r1
FCMPGT32 r33, r33, r34
ANDI r33, r33, 255d
JNE r33, r0, :2
CP r2, r0
JAL r31, r0, :box
CP r33, r1
CP r2, r32
JAL r31, r0, :box
CP r32, r1
FCMPLT32 r32, r33, r32
NOT r32, r32
ANDI r32, r32, 255d
JNE r32, r0, :3
CP r1, r0
JMP :4
3: LI64 r32, 4d
CP r1, r32
JMP :4
2: LI64 r32, 3d
CP r1, r32
JMP :4
1: LI64 r32, 2d
CP r1, r32
JMP :4
0: LI64 r32, 1d
CP r1, r32
4: LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
code size: 355
ret: 0
status: Ok(())


@ -1,7 +1,8 @@
main: main:
LRA r1, r0, :a LRA r13, r0, :a
LD r1, r1, 0a, 8h LD r13, r13, 0a, 8h
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 47 code size: 50
ret: 50 ret: 50
status: Ok(()) status: Ok(())


@ -1,7 +1,8 @@
main: main:
LRA r1, r0, :a LRA r13, r0, :a
LD r1, r1, 0a, 8h LD r13, r13, 0a, 8h
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 47 code size: 50
ret: 50 ret: 50
status: Ok(()) status: Ok(())


@ -1,20 +1,19 @@
cond: cond:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h ST r31, r254, 0a, 24h
JAL r31, r0, :cond JAL r31, r0, :cond
LI64 r32, 0d CP r32, r0
CP r33, r32 CP r33, r1
JNE r1, r33, :0 JNE r33, r32, :0
CP r32, r33
CP r1, r32
JMP :1 JMP :1
0: LI64 r1, 2d 0: LI64 r32, 2d
1: LD r31, r254, 0a, 24h 1: CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 134 code size: 117
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -1,6 +1,6 @@
main: main:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 22
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -0,0 +1,7 @@
main:
LI32 r13, 69w
CP r1, r13
JALA r0, r31, 0a
code size: 28
ret: 69
status: Ok(())


@ -1,6 +1,6 @@
main: main:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 22
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -0,0 +1,19 @@
main:
LI64 r15, 3d
LI64 r16, 10d
CP r14, r0
CP r13, r14
3: JNE r13, r16, :0
LI64 r14, -10d
ADD64 r14, r13, r14
CP r1, r14
JMP :1
0: DIRU64 r0, r17, r13, r15
JNE r17, r14, :2
JMP :2
2: ADDI64 r13, r13, 1d
JMP :3
1: JALA r0, r31, 0a
code size: 103
ret: 0
status: Ok(())


@ -1,5 +1,11 @@
main: fun:
UN UN
code size: 9 main:
ADDI64 r254, r254, -8d
ST r31, r254, 0a, 8h
JAL r31, r0, :fun
LD r31, r254, 0a, 8h
ADDI64 r254, r254, 8d
code size: 64
ret: 0 ret: 0
status: Err(Unreachable) status: Err(Unreachable)


@ -0,0 +1,72 @@
main:
ADDI64 r254, r254, -88d
ST r31, r254, 48a, 40h
LRA r32, r0, :glob_stru
JAL r31, r0, :new_stru
ST r1, r32, 0a, 16h
LD r33, r32, 0a, 8h
JEQ r33, r0, :0
LI64 r32, 300d
CP r1, r32
JMP :1
0: ST r0, r32, 0a, 8h
LD r33, r32, 0a, 8h
JEQ r33, r0, :2
ST r0, r32, 8a, 8h
LI64 r32, 200d
CP r1, r32
JMP :1
2: LI64 r34, 1d
ST r34, r32, 0a, 8h
ST r34, r32, 8a, 8h
ADDI64 r33, r254, 0d
ST r34, r254, 0a, 8h
ST r34, r254, 8a, 8h
ST r34, r254, 16a, 8h
ST r34, r254, 24a, 8h
ST r34, r254, 32a, 8h
ST r34, r254, 40a, 8h
ADDI64 r35, r33, 48d
CP r32, r33
8: JNE r35, r32, :3
LD r32, r254, 32a, 8h
JEQ r32, r0, :4
LI64 r32, 100d
CP r1, r32
JMP :1
4: ST r34, r254, 0a, 8h
ST r34, r254, 8a, 8h
ST r34, r254, 16a, 8h
ST r34, r254, 24a, 8h
ST r34, r254, 32a, 8h
ST r34, r254, 40a, 8h
CP r32, r33
7: LD r34, r254, 32a, 8h
JNE r35, r32, :5
JEQ r34, r0, :6
LI64 r32, 10d
CP r1, r32
JMP :1
6: CP r1, r0
JMP :1
5: ST r0, r32, 0a, 8h
ST r0, r32, 8a, 8h
ADDI64 r32, r32, 16d
JMP :7
3: JAL r31, r0, :new_stru
ST r1, r32, 0a, 16h
ADDI64 r32, r32, 16d
JMP :8
1: LD r31, r254, 48a, 40h
ADDI64 r254, r254, 88d
JALA r0, r31, 0a
new_stru:
ADDI64 r254, r254, -16d
ST r0, r254, 0a, 8h
ST r0, r254, 8a, 8h
LD r1, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 668
ret: 0
status: Ok(())


@ -1,30 +1,29 @@
main: main:
ADDI64 r254, r254, -12d ADDI64 r254, r254, -12d
LI8 r1, 255b LI8 r13, 255b
ST r1, r254, 0a, 1h ST r13, r254, 0a, 1h
LI8 r4, 0b ST r0, r254, 1a, 1h
ST r4, r254, 1a, 1h ST r0, r254, 2a, 1h
ST r4, r254, 2a, 1h ST r13, r254, 3a, 1h
ST r1, r254, 3a, 1h ST r0, r254, 4a, 4h
LI32 r9, 0w LD r13, r254, 4a, 4h
ST r9, r254, 4a, 4h LI32 r14, 2w
LI32 r12, 2w ST r14, r254, 8a, 4h
ST r12, r254, 8a, 4h LD r14, r254, 8a, 4h
LD r3, r254, 8a, 4h LI64 r15, 2d
ANDI r3, r3, 4294967295d ANDI r14, r14, 4294967295d
ANDI r12, r12, 4294967295d JEQ r14, r15, :0
JEQ r3, r12, :0 CP r1, r0
LI64 r1, 0d
JMP :1 JMP :1
0: LD r10, r254, 4a, 4h 0: ANDI r13, r13, 4294967295d
ANDI r10, r10, 4294967295d JEQ r13, r0, :2
ANDI r9, r9, 4294967295d LI64 r13, 64d
JEQ r10, r9, :2 CP r1, r13
LI64 r1, 64d
JMP :1 JMP :1
2: LI64 r1, 512d 2: LI64 r13, 512d
CP r1, r13
1: ADDI64 r254, r254, 12d 1: ADDI64 r254, r254, 12d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 257 code size: 235
ret: 512 ret: 512
status: Ok(()) status: Ok(())


@ -1,20 +1,21 @@
main: main:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -16d
LI64 r1, 10d LI64 r13, 10d
ADDI64 r4, r254, 0d ST r13, r254, 0a, 8h
ST r1, r254, 0a, 8h LI64 r13, 20d
LI64 r7, 20d ST r13, r254, 8a, 8h
ST r7, r254, 8a, 8h LI64 r13, 6d
LI64 r6, 6d LI64 r14, 5d
LI64 r5, 5d LI64 r15, 1d
LI64 r2, 1d CP r2, r15
CP r3, r4 LD r3, r254, 0a, 16h
LD r3, r3, 0a, 16h CP r5, r14
CP r6, r13
ECA ECA
LI64 r1, 0d CP r1, r0
ADDI64 r254, r254, 16d ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
ev: Ecall ev: Ecall
code size: 155 code size: 143
ret: 0 ret: 0
status: Ok(()) status: Ok(())

View file

@ -0,0 +1,28 @@
main:
ADDI64 r254, r254, -40d
ST r31, r254, 24a, 16h
LI64 r32, 1d
ST r32, r254, 0a, 8h
ST r0, r254, 8a, 8h
ST r0, r254, 16a, 8h
LD r2, r254, 8a, 16h
JAL r31, r0, :pass
CP r32, r1
CP r1, r32
LD r31, r254, 24a, 16h
ADDI64 r254, r254, 40d
JALA r0, r31, 0a
pass:
ADDI64 r254, r254, -16d
ST r2, r254, 0a, 16h
ADDI64 r2, r254, 0d
CP r13, r2
LD r14, r13, 0a, 8h
LD r13, r13, 8a, 8h
ADD64 r13, r13, r14
CP r1, r13
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
code size: 235
ret: 0
status: Ok(())

View file

@ -0,0 +1,20 @@
main:
ADDI64 r254, r254, -16d
ST r31, r254, 0a, 16h
JAL r31, r0, :some_enum
CP r32, r1
ANDI r32, r32, 255d
JNE r32, r0, :0
CP r1, r0
JMP :1
0: LI64 r32, 100d
CP r1, r32
1: LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
some_enum:
CP r1, r0
JALA r0, r31, 0a
code size: 128
ret: 0
status: Ok(())


@ -1,105 +1,114 @@
continue_and_state_change: continue_and_state_change:
LI64 r7, 3d CP r13, r2
LI64 r8, 4d LI64 r16, 3d
LI64 r9, 2d LI64 r17, 2d
LI64 r10, 10d LI64 r18, 10d
6: JLTU r2, r10, :0 CP r15, r0
CP r1, r2 LI64 r14, 4d
6: JLTU r13, r18, :0
JMP :1 JMP :1
0: JNE r2, r9, :2 0: JNE r13, r17, :2
CP r2, r8 CP r13, r14
JMP :3 JMP :3
2: JNE r2, r7, :4 2: JNE r13, r16, :4
LI64 r1, 0d CP r13, r15
1: JMP :5 1: CP r1, r13
4: ADDI64 r2, r2, 1d JMP :5
4: ADDI64 r13, r13, 1d
3: JMP :6 3: JMP :6
5: JALA r0, r31, 0a 5: JALA r0, r31, 0a
infinite_loop: infinite_loop:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -32d
ST r31, r254, 0a, 24h ST r31, r254, 0a, 32h
LI64 r32, 1d LI64 r34, 1d
LI64 r33, 0d CP r33, r0
CP r1, r33 CP r32, r33
1: JNE r1, r32, :0 1: JNE r32, r34, :0
JMP :0 JMP :0
0: CP r2, r33 0: CP r2, r33
JAL r31, r0, :continue_and_state_change JAL r31, r0, :continue_and_state_change
CP r32, r1
JMP :1 JMP :1
LD r31, r254, 0a, 24h LD r31, r254, 0a, 32h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -64d ADDI64 r254, r254, -40d
ST r31, r254, 0a, 64h ST r31, r254, 0a, 40h
LI64 r32, 0d CP r2, r0
CP r2, r32
JAL r31, r0, :multiple_breaks JAL r31, r0, :multiple_breaks
LI64 r32, 3d
CP r33, r1 CP r33, r1
LI64 r1, 3d JEQ r33, r32, :0
JEQ r33, r1, :0 LI64 r32, 1d
LI64 r1, 1d CP r1, r32
JMP :1 JMP :1
0: CP r34, r1 0: LI64 r33, 4d
LI64 r35, 4d CP r2, r33
CP r2, r35
JAL r31, r0, :multiple_breaks JAL r31, r0, :multiple_breaks
CP r36, r35 LI64 r34, 10d
LI64 r37, 10d CP r35, r1
JEQ r1, r37, :2 JEQ r35, r34, :2
LI64 r1, 2d LI64 r32, 2d
CP r1, r32
JMP :1 JMP :1
2: CP r2, r32 2: CP r2, r0
JAL r31, r0, :state_change_in_break JAL r31, r0, :state_change_in_break
JEQ r1, r32, :3 CP r35, r1
CP r1, r34 JEQ r35, r0, :3
CP r1, r32
JMP :1 JMP :1
3: CP r2, r36 3: CP r2, r33
JAL r31, r0, :state_change_in_break JAL r31, r0, :state_change_in_break
JEQ r1, r37, :4 CP r35, r1
CP r1, r36 JEQ r35, r34, :4
CP r1, r33
JMP :1 JMP :1
4: CP r2, r37 4: CP r2, r34
JAL r31, r0, :continue_and_state_change JAL r31, r0, :continue_and_state_change
JEQ r1, r37, :5 CP r33, r1
LI64 r1, 5d JEQ r33, r34, :5
LI64 r32, 5d
CP r1, r32
JMP :1 JMP :1
5: CP r2, r34 5: CP r2, r32
JAL r31, r0, :continue_and_state_change JAL r31, r0, :continue_and_state_change
JEQ r1, r32, :6 CP r32, r1
LI64 r1, 6d JEQ r32, r0, :6
LI64 r32, 6d
CP r1, r32
JMP :1 JMP :1
6: CP r38, r32 6: JAL r31, r0, :infinite_loop
JAL r31, r0, :infinite_loop CP r1, r0
CP r1, r38 1: LD r31, r254, 0a, 40h
1: LD r31, r254, 0a, 64h ADDI64 r254, r254, 40d
ADDI64 r254, r254, 64d
JALA r0, r31, 0a JALA r0, r31, 0a
multiple_breaks: multiple_breaks:
LI64 r6, 3d CP r13, r2
LI64 r5, 10d LI64 r14, 3d
4: JLTU r2, r5, :0 LI64 r15, 10d
CP r1, r2 4: JLTU r13, r15, :0
JMP :1 JMP :1
0: ADDI64 r1, r2, 1d 0: ADDI64 r13, r13, 1d
JNE r1, r6, :2 JNE r13, r14, :2
1: JMP :3 1: CP r1, r13
2: CP r2, r1 JMP :3
JMP :4 2: JMP :4
3: JALA r0, r31, 0a 3: JALA r0, r31, 0a
state_change_in_break: state_change_in_break:
LI64 r5, 3d CP r13, r2
LI64 r6, 10d LI64 r14, 3d
4: JLTU r2, r6, :0 LI64 r15, 10d
CP r1, r2 4: JLTU r13, r15, :0
JMP :1 JMP :1
0: JNE r2, r5, :2 0: JNE r13, r14, :2
LI64 r1, 0d CP r13, r0
1: JMP :3 1: CP r1, r13
2: ADDI64 r2, r2, 1d JMP :3
2: ADDI64 r13, r13, 1d
JMP :4 JMP :4
3: JALA r0, r31, 0a 3: JALA r0, r31, 0a
timed out timed out
code size: 668 code size: 667
ret: 10 ret: 10
status: Ok(()) status: Ok(())


@ -1,53 +1,55 @@
check_platform: check_platform:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -16d
ST r31, r254, 0a, 8h ST r31, r254, 0a, 16h
JAL r31, r0, :x86_fb_ptr JAL r31, r0, :x86_fb_ptr
LD r31, r254, 0a, 8h CP r32, r1
ADDI64 r254, r254, 8d CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -64d ADDI64 r254, r254, -56d
ST r31, r254, 0a, 64h ST r31, r254, 0a, 56h
JAL r31, r0, :check_platform JAL r31, r0, :check_platform
LI64 r32, 0d CP r35, r0
LI64 r33, 30d LI64 r36, 30d
LI64 r34, 100d LI64 r37, 100d
CP r35, r32 CP r34, r35
CP r36, r32 CP r32, r35
CP r37, r32 CP r33, r35
5: JLTU r35, r33, :0 5: JLTU r34, r36, :0
ADDI64 r36, r36, 1d ADDI64 r32, r32, 1d
CP r2, r32 CP r2, r35
CP r3, r36 CP r3, r32
CP r4, r33 CP r4, r36
JAL r31, r0, :set_pixel JAL r31, r0, :set_pixel
JEQ r1, r37, :1 CP r34, r1
CP r1, r32 JEQ r34, r33, :1
CP r1, r35
JMP :2 JMP :2
1: CP r38, r32 1: JNE r32, r37, :3
JNE r36, r34, :3 CP r1, r33
CP r1, r37
JMP :2 JMP :2
3: CP r1, r37 3: CP r34, r35
CP r35, r38
JMP :4 JMP :4
0: CP r1, r37 0: ADDI64 r33, r33, 1d
CP r38, r32 ADDI64 r34, r34, 1d
ADDI64 r1, r1, 1d 4: JMP :5
ADDI64 r35, r35, 1d 2: LD r31, r254, 0a, 56h
4: CP r32, r38 ADDI64 r254, r254, 56d
CP r37, r1
JMP :5
2: LD r31, r254, 0a, 64h
ADDI64 r254, r254, 64d
JALA r0, r31, 0a JALA r0, r31, 0a
set_pixel: set_pixel:
MUL64 r7, r3, r4 CP r13, r2
ADD64 r1, r7, r2 CP r14, r3
CP r15, r4
MUL64 r14, r14, r15
ADD64 r13, r14, r13
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
x86_fb_ptr: x86_fb_ptr:
LI64 r1, 100d LI64 r13, 100d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 330 code size: 329
ret: 3000 ret: 3000
status: Ok(()) status: Ok(())


@ -1,6 +1,7 @@
main: main:
LI32 r1, 3212836864w LI32 r13, 3212836864w
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 25 code size: 28
ret: 3212836864 ret: 3212836864
status: Ok(()) status: Ok(())


@ -1,21 +1,29 @@
add_one: add_one:
ADDI64 r1, r2, 1d CP r13, r2
ADDI64 r13, r13, 1d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
add_two: add_two:
ADDI64 r1, r2, 2d CP r13, r2
ADDI64 r13, r13, 2d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -24d
ST r31, r254, 0a, 16h ST r31, r254, 0a, 24h
LI64 r2, 10d LI64 r32, 10d
CP r2, r32
JAL r31, r0, :add_one JAL r31, r0, :add_one
CP r32, r1 CP r32, r1
LI64 r2, 20d LI64 r33, 20d
CP r2, r33
JAL r31, r0, :add_two JAL r31, r0, :add_two
ADD64 r1, r1, r32 CP r33, r1
LD r31, r254, 0a, 16h ADD64 r32, r33, r32
ADDI64 r254, r254, 16d CP r1, r32
LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 152 code size: 176
ret: 33 ret: 33
status: Ok(()) status: Ok(())


@ -0,0 +1,21 @@
b:
CP r13, r3
CP r1, r13
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -32d
ST r31, r254, 8a, 24h
ADDI64 r32, r254, 0d
LI64 r33, 100d
ST r33, r254, 0a, 8h
CP r2, r32
CP r3, r33
JAL r31, r0, :b
CP r32, r1
CP r1, r32
LD r31, r254, 8a, 24h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
code size: 137
ret: 100
status: Ok(())


@ -1,24 +1,38 @@
add: add:
ADD64 r1, r2, r3 CP r13, r2
CP r14, r3
ADD64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a
add:
CP r13, r2
CP r14, r3
ADD32 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
add: add:
ADD32 r1, r2, r3
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -32d
ST r31, r254, 0a, 24h ST r31, r254, 0a, 32h
LI32 r3, 2w JAL r31, r0, :add
CP r2, r3 LI32 r32, 2w
CP r2, r32
CP r3, r32
JAL r31, r0, :add JAL r31, r0, :add
CP r32, r1 CP r32, r1
LI64 r3, 3d LI64 r33, 3d
LI64 r2, 1d LI64 r34, 1d
CP r2, r34
CP r3, r33
JAL r31, r0, :add JAL r31, r0, :add
ANDI r33, r32, 4294967295d CP r33, r1
SUB64 r1, r33, r1 ANDI r32, r32, 4294967295d
LD r31, r254, 0a, 24h SUB64 r32, r32, r33
ADDI64 r254, r254, 24d CP r1, r32
LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 158 code size: 209
ret: 0 ret: 0
status: Ok(()) status: Ok(())


@ -0,0 +1,32 @@
main:
ADDI64 r254, r254, -8d
ST r31, r254, 0a, 8h
JAL r31, r0, :process
LD r31, r254, 0a, 8h
ADDI64 r254, r254, 8d
JALA r0, r31, 0a
opaque:
JALA r0, r31, 0a
process:
ADDI64 r254, r254, -48d
ST r31, r254, 16a, 32h
ADDI64 r33, r254, 0d
ST r0, r254, 0a, 1h
LI64 r32, 1000d
4: JGTU r32, r0, :0
JMP :1
0: CP r2, r33
JAL r31, r0, :opaque
LD r34, r254, 0a, 1h
ANDI r34, r34, 255d
JEQ r34, r0, :2
JMP :3
2: ADDI64 r32, r32, -1d
1: JMP :4
3: LD r31, r254, 16a, 32h
ADDI64 r254, r254, 48d
JALA r0, r31, 0a
timed out
code size: 248
ret: 0
status: Ok(())


@ -1,127 +1,219 @@
deinit: deinit:
ADDI64 r254, r254, -32d ADDI64 r254, r254, -40d
ST r31, r254, 0a, 32h ST r31, r254, 0a, 40h
CP r32, r2 CP r32, r2
LD r33, r2, 16a, 8h LD r33, r32, 16a, 8h
LI64 r4, 8d LI64 r34, 8d
MUL64 r3, r33, r4 MUL64 r33, r33, r34
CP r34, r32 LD r35, r32, 0a, 8h
LD r2, r34, 0a, 8h CP r2, r35
CP r3, r33
CP r4, r34
JAL r31, r0, :free JAL r31, r0, :free
CP r1, r32 CP r1, r32
JAL r31, r0, :new JAL r31, r0, :new
LD r31, r254, 0a, 32h LD r31, r254, 0a, 40h
ADDI64 r254, r254, 32d ADDI64 r254, r254, 40d
JALA r0, r31, 0a JALA r0, r31, 0a
free: deinit:
CP r10, r2 ADDI64 r254, r254, -40d
LRA r7, r0, :FREE_SYS_CALL ST r31, r254, 0a, 40h
LD r2, r7, 0a, 8h CP r32, r2
CP r5, r4 LI64 r33, 1d
CP r4, r3 LD r34, r32, 16a, 8h
CP r3, r10 LD r35, r32, 0a, 8h
ECA CP r2, r35
JALA r0, r31, 0a CP r3, r34
main: CP r4, r33
ADDI64 r254, r254, -56d JAL r31, r0, :free
ST r31, r254, 24a, 32h
ADDI64 r32, r254, 0d
CP r1, r32 CP r1, r32
JAL r31, r0, :new JAL r31, r0, :new
LI64 r3, 69d LD r31, r254, 0a, 40h
ADDI64 r254, r254, 40d
JALA r0, r31, 0a
free:
CP r13, r2
CP r14, r3
CP r15, r4
LRA r16, r0, :free_sys_call
LD r16, r16, 0a, 8h
CP r2, r16
CP r3, r13
CP r4, r14
CP r5, r15
ECA
CP r13, r1
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -88d
ST r31, r254, 48a, 40h
ADDI64 r32, r254, 24d
CP r1, r32
JAL r31, r0, :new
LI64 r33, 35d
CP r2, r32 CP r2, r32
CP r3, r33
JAL r31, r0, :push JAL r31, r0, :push
LD r33, r254, 0a, 8h ADDI64 r33, r254, 0d
LD r34, r33, 0a, 8h CP r1, r33
JAL r31, r0, :new
LI8 r34, 34b
CP r2, r33
CP r3, r34
JAL r31, r0, :push
LD r34, r254, 0a, 8h
LD r34, r34, 0a, 1h
LD r35, r254, 24a, 8h
LD r35, r35, 0a, 8h
CP r2, r33
JAL r31, r0, :deinit
CP r2, r32 CP r2, r32
JAL r31, r0, :deinit JAL r31, r0, :deinit
CP r1, r34 ANDI r32, r34, 255d
LD r31, r254, 24a, 32h ADD64 r32, r35, r32
ADDI64 r254, r254, 56d CP r1, r32
LD r31, r254, 48a, 40h
ADDI64 r254, r254, 88d
JALA r0, r31, 0a JALA r0, r31, 0a
malloc: malloc:
CP r9, r2 CP r13, r2
LRA r5, r0, :MALLOC_SYS_CALL CP r14, r3
LD r2, r5, 0a, 8h LRA r15, r0, :malloc_sys_call
CP r4, r3 LD r15, r15, 0a, 8h
CP r3, r9 CP r2, r15
CP r3, r13
CP r4, r14
ECA ECA
CP r13, r1
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
new: new:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
LI64 r4, 0d CP r15, r1
ADDI64 r5, r254, 0d LI64 r14, 8d
ST r4, r254, 0a, 8h ADDI64 r13, r254, 0d
ST r4, r254, 8a, 8h ST r14, r254, 0a, 8h
ST r4, r254, 16a, 8h ST r0, r254, 8a, 8h
BMC r5, r1, 24h ST r0, r254, 16a, 8h
BMC r13, r15, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
new:
ADDI64 r254, r254, -24d
CP r15, r1
LI64 r14, 1d
ADDI64 r13, r254, 0d
ST r14, r254, 0a, 8h
ST r0, r254, 8a, 8h
ST r0, r254, 16a, 8h
BMC r13, r15, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
push: push:
ADDI64 r254, r254, -192d ADDI64 r254, r254, -80d
ST r31, r254, 0a, 192h ST r31, r254, 0a, 80h
CP r32, r3
LI64 r33, 1d
LD r34, r2, 8a, 8h
LD r35, r2, 16a, 8h
CP r36, r2 CP r36, r2
JNE r35, r34, :0 CP r37, r3
LI64 r37, 0d LI64 r35, 1d
JNE r35, r37, :1 LD r33, r36, 8a, 8h
CP r38, r33 LD r32, r36, 16a, 8h
JNE r32, r33, :0
JNE r32, r0, :1
CP r32, r35
JMP :2 JMP :2
1: MULI64 r38, r35, 2d 1: MULI64 r32, r32, 2d
2: LI64 r39, 8d 2: CP r2, r32
MUL64 r2, r38, r39 CP r3, r35
CP r3, r39
JAL r31, r0, :malloc JAL r31, r0, :malloc
CP r40, r1 ST r32, r36, 16a, 8h
LI64 r1, 0d CP r34, r1
CP r41, r40 JNE r34, r0, :3
JNE r41, r1, :3 CP r1, r0
JMP :4 JMP :4
3: CP r40, r41 3: LD r32, r36, 0a, 8h
CP r42, r36 ADD64 r38, r33, r32
ST r38, r42, 16a, 8h CP r33, r34
LD r36, r42, 8a, 8h 7: LD r39, r36, 0a, 8h
MULI64 r43, r36, 8d LD r40, r36, 8a, 8h
LD r44, r42, 0a, 8h JNE r38, r32, :5
ADD64 r45, r44, r43 JEQ r40, r0, :6
CP r46, r40 CP r2, r39
9: LD r2, r42, 0a, 8h CP r3, r40
LD r47, r42, 8a, 8h CP r4, r35
JNE r45, r44, :5
JEQ r47, r37, :6
CP r4, r39
MUL64 r3, r47, r4
JAL r31, r0, :free JAL r31, r0, :free
CP r1, r40 JMP :6
6: ST r34, r36, 0a, 8h
JMP :0
5: LD r39, r32, 0a, 1h
ST r39, r33, 0a, 1h
ADDI64 r33, r33, 1d
ADDI64 r32, r32, 1d
JMP :7 JMP :7
6: CP r1, r40 0: LD r32, r36, 8a, 8h
7: ST r1, r42, 0a, 8h LD r33, r36, 0a, 8h
JMP :8 ADD64 r33, r32, r33
5: CP r1, r40 ST r37, r33, 0a, 1h
CP r4, r39 ADD64 r32, r32, r35
ADDI64 r41, r46, 8d ST r32, r36, 8a, 8h
ADDI64 r48, r44, 8d CP r1, r33
LD r49, r44, 0a, 8h 4: LD r31, r254, 0a, 80h
ST r49, r46, 0a, 8h ADDI64 r254, r254, 80d
CP r44, r48
CP r46, r41
JMP :9
0: CP r42, r36
8: LD r50, r42, 8a, 8h
MULI64 r51, r50, 8d
LD r52, r42, 0a, 8h
ADD64 r1, r52, r51
CP r3, r32
ST r3, r1, 0a, 8h
LD r53, r42, 8a, 8h
ADD64 r54, r53, r33
ST r54, r42, 8a, 8h
4: LD r31, r254, 0a, 192h
ADDI64 r254, r254, 192d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 955 push:
ADDI64 r254, r254, -88d
ST r31, r254, 0a, 88h
CP r36, r2
CP r37, r3
LI64 r35, 1d
LD r33, r36, 8a, 8h
LD r32, r36, 16a, 8h
JNE r32, r33, :0
JNE r32, r0, :1
CP r32, r35
JMP :2
1: MULI64 r32, r32, 2d
2: LI64 r38, 8d
MUL64 r34, r32, r38
CP r2, r34
CP r3, r38
JAL r31, r0, :malloc
ST r32, r36, 16a, 8h
CP r34, r1
JNE r34, r0, :3
CP r1, r0
JMP :4
3: MULI64 r33, r33, 8d
LD r32, r36, 0a, 8h
ADD64 r39, r32, r33
CP r33, r34
7: LD r40, r36, 0a, 8h
LD r41, r36, 8a, 8h
JNE r39, r32, :5
JEQ r41, r0, :6
MUL64 r32, r41, r38
CP r2, r40
CP r3, r32
CP r4, r38
JAL r31, r0, :free
JMP :6
6: ST r34, r36, 0a, 8h
JMP :0
5: LD r40, r32, 0a, 8h
ST r40, r33, 0a, 8h
ADDI64 r33, r33, 8d
ADDI64 r32, r32, 8d
JMP :7
0: LD r32, r36, 8a, 8h
MULI64 r33, r32, 8d
LD r34, r36, 0a, 8h
ADD64 r33, r34, r33
ST r37, r33, 0a, 8h
ADD64 r32, r32, r35
ST r32, r36, 8a, 8h
CP r1, r33
4: LD r31, r254, 0a, 88h
ADDI64 r254, r254, 88d
JALA r0, r31, 0a
code size: 1623
ret: 69 ret: 69
status: Ok(()) status: Ok(())

@ -1,7 +1,6 @@
clobber: clobber:
LRA r1, r0, :var LRA r13, r0, :var
LI64 r3, 0d ST r0, r13, 0a, 8h
ST r3, r1, 0a, 8h
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
@ -10,10 +9,11 @@ main:
LI64 r33, 2d LI64 r33, 2d
ST r33, r32, 0a, 8h ST r33, r32, 0a, 8h
JAL r31, r0, :clobber JAL r31, r0, :clobber
LD r1, r32, 0a, 8h LD r32, r32, 0a, 8h
CP r1, r32
LD r31, r254, 0a, 24h LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 166 code size: 159
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -0,0 +1,23 @@
inb:
CP r1, r0
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -24d
ST r31, r254, 0a, 24h
LRA r32, r0, :ports
LD r33, r32, 0a, 1h
ANDI r33, r33, 255d
JNE r33, r0, :0
JMP :1
0: JAL r31, r0, :inb
CP r33, r1
CMPU r33, r33, r0
CMPUI r33, r33, 0d
NOT r33, r33
ST r33, r32, 0a, 1h
1: LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
code size: 164
ret: 0
status: Ok(())

@ -1,9 +1,10 @@
main: main:
LRA r2, r0, :complex_global_var LRA r13, r0, :complex_global_var
LD r3, r2, 0a, 8h LD r14, r13, 0a, 8h
ADDI64 r1, r3, 5d ADDI64 r14, r14, 5d
ST r1, r2, 0a, 8h ST r14, r13, 0a, 8h
CP r1, r14
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 71 code size: 74
ret: 55 ret: 55
status: Ok(()) status: Ok(())

@ -1,6 +1,6 @@
main: main:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 22
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,20 +1,20 @@
main: main:
ADDI64 r254, r254, -128d ADDI64 r254, r254, -128d
LI8 r5, 69b ADDI64 r14, r254, 0d
LI64 r6, 128d LI8 r15, 69b
LI64 r7, 0d LI64 r16, 128d
ADDI64 r4, r254, 0d CP r13, r0
2: LD r12, r254, 42a, 1h 2: LD r17, r254, 42a, 1h
JLTU r7, r6, :0 JLTU r13, r16, :0
ANDI r1, r12, 255d ANDI r13, r17, 255d
CP r1, r13
JMP :1 JMP :1
0: ADDI64 r3, r7, 1d 0: ADD64 r17, r14, r13
ADD64 r7, r4, r7 ST r15, r17, 0a, 1h
ST r5, r7, 0a, 1h ADDI64 r13, r13, 1d
CP r7, r3
JMP :2 JMP :2
1: ADDI64 r254, r254, 128d 1: ADDI64 r254, r254, 128d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 145 code size: 138
ret: 69 ret: 69
status: Ok(()) status: Ok(())

@ -1,30 +1,36 @@
fib: fib:
ADDI64 r254, r254, -40d ADDI64 r254, r254, -32d
ST r31, r254, 0a, 40h ST r31, r254, 0a, 32h
LI64 r1, 1d CP r32, r2
LI64 r32, 2d LI64 r33, 1d
JGTU r2, r32, :0 LI64 r34, 2d
JGTU r32, r34, :0
CP r1, r33
JMP :1 JMP :1
0: CP r33, r2 0: SUB64 r33, r32, r33
SUB64 r2, r33, r1 CP r2, r33
CP r34, r33
JAL r31, r0, :fib JAL r31, r0, :fib
CP r2, r34 CP r33, r1
CP r35, r1 SUB64 r32, r32, r34
SUB64 r2, r2, r32 CP r2, r32
JAL r31, r0, :fib JAL r31, r0, :fib
ADD64 r1, r1, r35 CP r32, r1
1: LD r31, r254, 0a, 40h ADD64 r32, r32, r33
ADDI64 r254, r254, 40d CP r1, r32
1: LD r31, r254, 0a, 32h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -16d
ST r31, r254, 0a, 8h ST r31, r254, 0a, 16h
LI64 r2, 10d LI64 r32, 10d
CP r2, r32
JAL r31, r0, :fib JAL r31, r0, :fib
LD r31, r254, 0a, 8h CP r32, r1
ADDI64 r254, r254, 8d CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 211 code size: 229
ret: 55 ret: 55
status: Ok(()) status: Ok(())

@ -1,9 +1,9 @@
main: main:
LI64 r2, 0d CP r13, r0
0: ADDI64 r2, r2, 1d 0: ADDI64 r13, r13, 1d
JMP :0 JMP :0
JALA r0, r31, 0a JALA r0, r31, 0a
timed out timed out
code size: 45 code size: 38
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,17 +1,20 @@
main: main:
LI64 r7, 6d LI64 r13, 8d
LRA r3, r0, :gb CP r2, r13
LI64 r6, 0d ECA
LD r8, r3, 0a, 8h LI64 r14, 6d
CMPU r9, r8, r6 LRA r13, r0, :gb
CMPUI r9, r9, 0d LD r13, r13, 0a, 8h
ORI r11, r9, 0d CMPU r13, r13, r0
ANDI r11, r11, 255d CMPUI r13, r13, 0d
JNE r11, r0, :0 OR r13, r13, r0
CP r4, r7 ANDI r13, r13, 255d
JNE r13, r0, :0
CP r13, r14
JMP :1 JMP :1
0: LI64 r4, 1d 0: LI64 r13, 1d
1: SUB64 r1, r4, r7 1: SUB64 r13, r13, r14
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 131 code size: 131
ret: 0 ret: 0

@ -0,0 +1,19 @@
main:
ADDI64 r254, r254, -72d
ADDI64 r13, r254, 24d
ST r0, r254, 24a, 8h
LI64 r14, 1d
ST r14, r254, 32a, 8h
LI64 r14, 2d
ST r14, r254, 40a, 8h
ADDI64 r14, r254, 0d
BMC r13, r14, 24h
ADDI64 r13, r254, 48d
BMC r14, r13, 24h
LD r13, r254, 48a, 8h
CP r1, r13
ADDI64 r254, r254, 72d
JALA r0, r31, 0a
code size: 159
ret: 0
status: Ok(())

@ -1,39 +1,29 @@
main: main:
ADDI64 r254, r254, -32d ADDI64 r254, r254, -16d
ST r31, r254, 0a, 32h ST r31, r254, 0a, 16h
JAL r31, r0, :scalar_values JAL r31, r0, :scalar_values
LI64 r32, 0d CP r32, r1
CP r33, r32 JEQ r32, r0, :0
JEQ r1, r33, :0 LI64 r32, 1d
LI64 r1, 1d CP r1, r32
JMP :1 JMP :1
0: JAL r31, r0, :structs 0: JAL r31, r0, :structs
CP r34, r33 CP r32, r1
JEQ r1, r34, :2 JEQ r32, r0, :2
JAL r31, r0, :structs JAL r31, r0, :structs
CP r32, r1
CP r1, r32
JMP :1 JMP :1
2: CP r1, r34 2: CP r1, r0
CP r33, r34 1: LD r31, r254, 0a, 16h
1: LD r31, r254, 0a, 32h ADDI64 r254, r254, 16d
ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
scalar_values: scalar_values:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
structs: structs:
ADDI64 r254, r254, -32d CP r1, r0
LI64 r1, 5d
ST r1, r254, 16a, 8h
ST r1, r254, 24a, 8h
LD r5, r254, 16a, 8h
ADDI64 r7, r5, 15d
ST r7, r254, 0a, 8h
LI64 r10, 20d
ST r10, r254, 8a, 8h
LD r1, r254, 0a, 8h
SUB64 r1, r1, r10
ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 307 code size: 164
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,6 +1,7 @@
main: main:
LI64 r1, 10d LI64 r13, 10d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 32
ret: 10 ret: 10
status: Ok(()) status: Ok(())

@ -1,109 +1,101 @@
main: main:
ADDI64 r254, r254, -106d ADDI64 r254, r254, -98d
ST r31, r254, 58a, 48h ST r31, r254, 58a, 40h
ADDI64 r32, r254, 33d ADDI64 r32, r254, 33d
ADDI64 r2, r254, 34d ADDI64 r33, r254, 34d
ADDI64 r6, r254, 1d
LI64 r33, 0d
ADDI64 r4, r254, 17d
ST r32, r254, 34a, 8h ST r32, r254, 34a, 8h
LI64 r34, 100d LI64 r34, 100d
ADDI64 r7, r254, 0d
LI8 r35, 1b LI8 r35, 1b
ST r33, r254, 1a, 8h ST r0, r254, 1a, 8h
ST r33, r254, 17a, 8h ST r0, r254, 17a, 8h
ST r34, r254, 42a, 8h ST r34, r254, 42a, 8h
LI8 r36, 0b
ST r35, r254, 0a, 1h ST r35, r254, 0a, 1h
ST r33, r254, 9a, 8h ST r0, r254, 9a, 8h
ST r33, r254, 25a, 8h ST r0, r254, 25a, 8h
ST r34, r254, 50a, 8h ST r34, r254, 50a, 8h
ST r36, r254, 33a, 1h ST r0, r254, 33a, 1h
CP r3, r4 CP r2, r33
CP r5, r6 LD r3, r254, 17a, 16h
LD r3, r3, 0a, 16h LD r5, r254, 1a, 16h
LD r5, r5, 0a, 16h LD r7, r254, 0a, 1h
LD r7, r7, 0a, 1h
JAL r31, r0, :put_filled_rect JAL r31, r0, :put_filled_rect
LD r31, r254, 58a, 48h LD r31, r254, 58a, 40h
ADDI64 r254, r254, 106d ADDI64 r254, r254, 98d
JALA r0, r31, 0a JALA r0, r31, 0a
put_filled_rect: put_filled_rect:
ADDI64 r254, r254, -212d ADDI64 r254, r254, -108d
ST r32, r254, 108a, 104h CP r14, r2
ST r3, r254, 92a, 16h ST r3, r254, 92a, 16h
ADDI64 r3, r254, 92d ADDI64 r3, r254, 92d
CP r15, r3
ST r5, r254, 76a, 16h ST r5, r254, 76a, 16h
ADDI64 r5, r254, 76d ADDI64 r5, r254, 76d
CP r13, r5
ST r7, r254, 75a, 1h ST r7, r254, 75a, 1h
ADDI64 r7, r254, 75d ADDI64 r7, r254, 75d
LI64 r8, 25d CP r16, r7
LI64 r32, 2d ADDI64 r17, r254, 25d
LI64 r6, 8d LI8 r18, 5b
ADDI64 r33, r254, 25d ST r18, r254, 25a, 1h
ADDI64 r34, r254, 50d LD r19, r13, 0a, 8h
LI8 r35, 5b ST r19, r254, 26a, 4h
ST r35, r254, 25a, 1h LI64 r20, 1d
LD r36, r5, 0a, 8h ST r20, r254, 30a, 4h
ST r36, r254, 26a, 4h ST r16, r254, 34a, 8h
LI64 r37, 1d LI64 r21, 25d
ST r37, r254, 30a, 4h ADDI64 r22, r254, 50d
ST r7, r254, 34a, 8h ST r18, r254, 50a, 1h
ST r35, r254, 50a, 1h ST r19, r254, 51a, 4h
ST r36, r254, 51a, 4h ST r20, r254, 55a, 4h
ST r37, r254, 55a, 4h ST r16, r254, 59a, 8h
ST r7, r254, 59a, 8h LI64 r23, 2d
CP r38, r7 LI64 r24, 8d
LD r7, r3, 8a, 8h LD r25, r15, 8a, 8h
LD r39, r5, 8a, 8h LD r13, r13, 8a, 8h
ADD64 r11, r39, r7 ADD64 r26, r13, r25
SUB64 r4, r11, r37 SUB64 r26, r26, r20
LD r40, r2, 8a, 8h LD r27, r14, 8a, 8h
MUL64 r5, r40, r4 MUL64 r26, r27, r26
LD r9, r2, 0a, 8h LD r14, r14, 0a, 8h
ADD64 r10, r9, r5 ADD64 r26, r14, r26
LD r2, r3, 0a, 8h LD r28, r15, 0a, 8h
ADD64 r41, r2, r10 MUL64 r15, r27, r25
MUL64 r3, r40, r7 ADD64 r14, r14, r15
ADD64 r4, r9, r3 ADD64 r15, r28, r26
ADD64 r42, r2, r4 ADD64 r14, r28, r14
3: JGTU r39, r37, :0 3: JGTU r13, r20, :0
JNE r39, r37, :1 JNE r13, r20, :1
ADDI64 r4, r254, 0d ADDI64 r13, r254, 0d
ST r35, r254, 0a, 1h ST r18, r254, 0a, 1h
ST r36, r254, 1a, 4h ST r19, r254, 1a, 4h
ST r37, r254, 5a, 4h ST r20, r254, 5a, 4h
ST r38, r254, 9a, 8h ST r16, r254, 9a, 8h
ST r42, r254, 17a, 8h ST r14, r254, 17a, 8h
CP r2, r6 CP r2, r24
CP r3, r32 CP r3, r23
CP r5, r8 CP r4, r13
CP r5, r21
ECA ECA
JMP :1 JMP :1
1: JMP :2 1: JMP :2
0: CP r3, r32 0: ST r14, r254, 67a, 8h
CP r43, r6 CP r2, r24
CP r44, r8 CP r3, r23
ST r42, r254, 67a, 8h CP r4, r22
CP r2, r43 CP r5, r21
CP r4, r34
CP r5, r44
ECA ECA
ST r41, r254, 42a, 8h ST r15, r254, 42a, 8h
CP r2, r43 CP r2, r24
CP r3, r32 CP r3, r23
CP r4, r33 CP r4, r17
CP r5, r44 CP r5, r21
ECA ECA
ADD64 r42, r40, r42 SUB64 r13, r13, r23
SUB64 r41, r41, r40 SUB64 r15, r15, r27
SUB64 r39, r39, r32 ADD64 r14, r27, r14
CP r6, r43
CP r8, r44
JMP :3 JMP :3
2: LD r32, r254, 108a, 104h 2: ADDI64 r254, r254, 108d
ADDI64 r254, r254, 212d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 917 code size: 842
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,8 +0,0 @@
main:
LRA r2, r0, :x
LI64 r1, 0d
ST r1, r2, 0a, 8h
JALA r0, r31, 0a
code size: 57
ret: 0
status: Ok(())

@ -1,20 +1,25 @@
main: main:
ADDI64 r254, r254, -32d ADDI64 r254, r254, -48d
ST r31, r254, 16a, 16h ST r31, r254, 16a, 32h
ADDI64 r3, r254, 0d ADDI64 r32, r254, 0d
ADDI64 r2, r254, 8d ADDI64 r33, r254, 8d
LI64 r32, 0d ST r0, r254, 0a, 8h
ST r32, r254, 0a, 8h ST r0, r254, 8a, 8h
ST r32, r254, 8a, 8h LI64 r34, 1024d
LI64 r4, 1024d CP r2, r33
CP r3, r32
CP r4, r34
JAL r31, r0, :set JAL r31, r0, :set
ANDI r1, r1, 4294967295d CP r32, r1
LD r31, r254, 16a, 16h ANDI r32, r32, 4294967295d
ADDI64 r254, r254, 32d CP r1, r32
LD r31, r254, 16a, 32h
ADDI64 r254, r254, 48d
JALA r0, r31, 0a JALA r0, r31, 0a
set: set:
CP r1, r4 CP r13, r4
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 167 code size: 175
ret: 1024 ret: 1024
status: Ok(()) status: Ok(())

@ -1,29 +1,28 @@
integer_range: integer_range:
ADDI64 r254, r254, -16d CP r13, r2
ST r32, r254, 0a, 16h CP r14, r3
CP r32, r2 LI64 r15, 4d
CP r33, r3 LI64 r16, 3d
LI64 r3, 4d CP r2, r16
LI64 r2, 3d CP r3, r15
ECA ECA
CP r2, r32 SUB64 r14, r14, r13
CP r3, r33 ADDI64 r14, r14, 1d
SUB64 r11, r3, r2 CP r15, r1
ADDI64 r3, r11, 1d DIRU64 r0, r14, r15, r14
DIRU64 r0, r3, r1, r3 ADD64 r13, r14, r13
ADD64 r1, r3, r2 CP r1, r13
LD r32, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -16d
ST r31, r254, 0a, 8h ST r31, r254, 0a, 16h
LI64 r3, 1000d LI64 r32, 1000d
LI64 r2, 0d CP r2, r0
CP r3, r32
JAL r31, r0, :integer_range JAL r31, r0, :integer_range
LD r31, r254, 0a, 8h LD r31, r254, 0a, 16h
ADDI64 r254, r254, 8d ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 210 code size: 164
ret: 42 ret: 42
status: Ok(()) status: Ok(())

@ -0,0 +1,49 @@
chars:
ADDI64 r254, r254, -32d
ST r3, r254, 16a, 16h
ADDI64 r3, r254, 16d
CP r13, r3
ADDI64 r14, r254, 0d
BMC r13, r14, 16h
LD r1, r14, 0a, 16h
ADDI64 r254, r254, 32d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -56d
ST r31, r254, 32a, 24h
LRA r32, r0, :Hello, World!
ST r32, r254, 16a, 8h
LI64 r32, 13d
ST r32, r254, 24a, 8h
ADDI64 r32, r254, 0d
LD r3, r254, 16a, 16h
JAL r31, r0, :chars
ST r1, r32, 0a, 16h
2: CP r2, r32
JAL r31, r0, :next
CP r33, r1
ANDI r33, r33, 65535d
JNE r33, r0, :0
JMP :1
0: JMP :2
1: LD r31, r254, 32a, 24h
ADDI64 r254, r254, 56d
JALA r0, r31, 0a
next:
CP r13, r2
LD r14, r13, 8a, 8h
JNE r14, r0, :0
CP r1, r0
JMP :1
0: LD r15, r13, 0a, 8h
ADDI64 r15, r15, 1d
ST r15, r13, 0a, 8h
ADDI64 r14, r14, -1d
LD r15, r15, 0a, 1h
ST r14, r13, 8a, 8h
ORI r13, r15, 32768d
CP r1, r13
1: JALA r0, r31, 0a
code size: 423
ret: 0
status: Ok(())

@ -1,16 +1,16 @@
main: main:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -8d
LI64 r3, 0d LI64 r13, 10d
LI64 r2, 10d ST r13, r254, 0a, 8h
ST r2, r254, 0a, 8h 2: LD r13, r254, 0a, 8h
2: LD r1, r254, 0a, 8h JNE r13, r0, :0
JNE r1, r3, :0 CP r1, r13
JMP :1 JMP :1
0: ADDI64 r11, r1, -1d 0: ADDI64 r13, r13, -1d
ST r11, r254, 0a, 8h ST r13, r254, 0a, 8h
JMP :2 JMP :2
1: ADDI64 r254, r254, 8d 1: ADDI64 r254, r254, 8d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 126 code size: 119
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,25 +1,28 @@
fib: fib:
LI64 r4, 1d CP r13, r2
LI64 r5, 0d LI64 r17, 1d
CP r1, r5 CP r15, r0
CP r10, r4 CP r14, r15
2: JNE r2, r5, :0 CP r16, r17
2: JNE r13, r15, :0
CP r1, r14
JMP :1 JMP :1
0: ADD64 r1, r10, r1 0: SUB64 r13, r13, r17
SUB64 r2, r2, r4 ADD64 r14, r16, r14
CP r3, r1 SWA r14, r16
CP r1, r10
CP r10, r3
JMP :2 JMP :2
1: JALA r0, r31, 0a 1: JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -8d ADDI64 r254, r254, -16d
ST r31, r254, 0a, 8h ST r31, r254, 0a, 16h
LI64 r2, 10d LI64 r32, 10d
CP r2, r32
JAL r31, r0, :fib JAL r31, r0, :fib
LD r31, r254, 0a, 8h CP r32, r1
ADDI64 r254, r254, 8d CP r1, r32
LD r31, r254, 0a, 16h
ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 153 code size: 155
ret: 55 ret: 55
status: Ok(()) status: Ok(())

@ -1,6 +1,7 @@
main: main:
LI64 r1, 1d LI64 r13, 1d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 29 code size: 32
ret: 1 ret: 1
status: Ok(()) status: Ok(())

@ -0,0 +1,36 @@
decide:
ADDI64 r254, r254, -24d
CP r14, r2
CP r15, r1
ADDI64 r13, r254, 0d
ST r14, r254, 0a, 8h
ST r0, r254, 8a, 8h
ST r0, r254, 16a, 8h
BMC r13, r15, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -104d
ST r31, r254, 72a, 32h
ADDI64 r32, r254, 48d
CP r1, r32
CP r2, r0
JAL r31, r0, :decide
ADDI64 r33, r254, 24d
BMC r32, r33, 24h
LI64 r34, 1d
CP r1, r33
CP r2, r34
JAL r31, r0, :decide
ADDI64 r34, r254, 0d
BMC r32, r34, 24h
LD r32, r254, 24a, 8h
LD r33, r254, 0a, 8h
ADD64 r32, r33, r32
CP r1, r32
LD r31, r254, 72a, 32h
ADDI64 r254, r254, 104d
JALA r0, r31, 0a
code size: 273
ret: 1
status: Ok(())

@ -0,0 +1,59 @@
main:
ADDI64 r254, r254, -72d
ST r31, r254, 32a, 40h
LRA r32, r0, :"Goodbye, World!\0"
LRA r33, r0, :"Hello, World!\0"
ST r32, r254, 16a, 8h
ST r33, r254, 24a, 8h
LD r2, r254, 24a, 8h
LD r3, r254, 16a, 8h
JAL r31, r0, :print
ADDI64 r34, r254, 8d
ADDI64 r35, r254, 0d
ST r32, r254, 8a, 8h
ST r33, r254, 0a, 8h
CP r2, r35
CP r3, r34
JAL r31, r0, :print2
LD r31, r254, 32a, 40h
ADDI64 r254, r254, 72d
JALA r0, r31, 0a
print:
ADDI64 r254, r254, -16d
ST r2, r254, 8a, 8h
ADDI64 r2, r254, 8d
CP r13, r2
ST r3, r254, 0a, 8h
ADDI64 r3, r254, 0d
CP r14, r3
LD r13, r13, 0a, 8h
LI64 r15, 37d
CP r2, r15
CP r3, r13
ECA
LD r13, r14, 0a, 8h
CP r2, r15
CP r3, r13
ECA
ADDI64 r254, r254, 16d
JALA r0, r31, 0a
print2:
CP r13, r2
CP r14, r3
LD r13, r13, 0a, 8h
LI64 r15, 37d
CP r2, r15
CP r3, r13
ECA
LD r13, r14, 0a, 8h
CP r2, r15
CP r3, r13
ECA
JALA r0, r31, 0a
Hello, World!
Goodbye, World!
Hello, World!
Goodbye, World!
code size: 435
ret: 0
status: Ok(())

@ -1,27 +1,23 @@
main: main:
ADDI64 r254, r254, -32d ADDI64 r254, r254, -24d
ST r31, r254, 0a, 32h ST r31, r254, 0a, 24h
JAL r31, r0, :opaque JAL r31, r0, :opaque
CP r32, r1 CP r33, r1
JAL r31, r0, :opaque JAL r31, r0, :opaque
LI64 r33, 0d JNE r33, r0, :0
CP r1, r32 CP r32, r0
JNE r1, r33, :0
CP r32, r1
LI64 r1, 0d
CP r34, r32
JMP :1 JMP :1
0: CP r34, r1 0: LD r32, r33, 0a, 8h
LD r1, r34, 0a, 8h 1: JEQ r33, r0, :2
1: JEQ r34, r33, :2 LD r32, r33, 0a, 8h
LD r1, r34, 0a, 8h
JMP :2 JMP :2
2: LD r31, r254, 0a, 32h 2: CP r1, r32
ADDI64 r254, r254, 32d LD r31, r254, 0a, 24h
ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
opaque: opaque:
LI64 r1, 0d CP r1, r0
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 183 code size: 150
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,6 +1,6 @@
test.hb:4:17: unwrap is not needed since the value is (provably) never null, remove it, or replace with '@as(<expr_ty>, <opt_expr>)' test.hb:4:18: unwrap is not needed since the value is (provably) never null, remove it, or replace with '@as(<expr_ty>, <opt_expr>)'
ptr := @unwrap(always_nn) ptr1 := @unwrap(always_nn)
^ ^
test.hb:6:16: unwrap is incorrect since the value is (provably) always null, make sure your logic is correct test.hb:6:18: unwrap is incorrect since the value is (provably) always null, make sure your logic is correct
ptr = @unwrap(always_n) ptr2 := @unwrap(always_n)
^ ^

@ -0,0 +1,31 @@
main:
ADDI64 r254, r254, -30d
ST r31, r254, 6a, 24h
ADDI64 r32, r254, 0d
2: JAL r31, r0, :return_fn
ST r1, r32, 0a, 6h
LD r33, r254, 0a, 1h
ANDI r33, r33, 255d
JEQ r33, r0, :0
LI64 r32, 1d
CP r1, r32
JMP :1
0: JMP :2
1: LD r31, r254, 6a, 24h
ADDI64 r254, r254, 30d
JALA r0, r31, 0a
return_fn:
ADDI64 r254, r254, -6d
LI8 r13, 1b
ST r13, r254, 0a, 1h
ST r0, r254, 1a, 1h
ST r0, r254, 2a, 1h
ST r0, r254, 3a, 1h
ST r0, r254, 4a, 1h
ST r0, r254, 5a, 1h
LD r1, r254, 0a, 6h
ADDI64 r254, r254, 6d
JALA r0, r31, 0a
code size: 277
ret: 1
status: Ok(())

@ -0,0 +1,72 @@
foo:
ADDI64 r254, r254, -112d
ST r31, r254, 80a, 32h
ADDI64 r32, r254, 64d
LRA r33, r0, :some_file
CP r3, r33
JAL r31, r0, :get
ST r1, r32, 0a, 16h
LD r33, r254, 64a, 1h
ANDI r33, r33, 255d
JNE r33, r0, :0
ST r0, r254, 48a, 1h
LD r1, r254, 48a, 16h
JMP :1
0: LI8 r33, 1b
LI64 r34, 4d
LD r32, r254, 72a, 8h
JNE r32, r34, :2
ST r33, r254, 32a, 1h
LI64 r32, 2d
ST r32, r254, 40a, 8h
LD r1, r254, 32a, 16h
JMP :1
2: LRA r34, r0, :magic
LD r34, r34, 0a, 8h
JNE r34, r32, :3
ST r33, r254, 16a, 1h
ST r0, r254, 24a, 8h
LD r1, r254, 16a, 16h
JMP :1
3: ST r0, r254, 0a, 1h
LD r1, r254, 0a, 16h
1: LD r31, r254, 80a, 32h
ADDI64 r254, r254, 112d
JALA r0, r31, 0a
get:
ADDI64 r254, r254, -32d
CP r13, r3
LD r13, r13, 0a, 1h
LRA r14, r0, :magic
ANDI r13, r13, 255d
LD r14, r14, 0a, 8h
JNE r14, r13, :0
LI8 r13, 1b
ST r13, r254, 16a, 1h
ST r14, r254, 24a, 8h
LD r1, r254, 16a, 16h
JMP :1
0: ST r0, r254, 0a, 1h
LD r1, r254, 0a, 16h
1: ADDI64 r254, r254, 32d
JALA r0, r31, 0a
main:
ADDI64 r254, r254, -40d
ST r31, r254, 16a, 24h
ADDI64 r32, r254, 0d
JAL r31, r0, :foo
ST r1, r32, 0a, 16h
LD r33, r254, 0a, 1h
ANDI r33, r33, 255d
JNE r33, r0, :0
LI64 r32, 100d
CP r1, r32
JMP :1
0: LD r32, r254, 8a, 8h
CP r1, r32
1: LD r31, r254, 16a, 24h
ADDI64 r254, r254, 40d
JALA r0, r31, 0a
code size: 673
ret: 0
status: Ok(())

@ -1,26 +1,24 @@
get_ptr: get_ptr:
ADDI64 r254, r254, -8d CP r1, r0
ADDI64 r1, r254, 0d
ADDI64 r254, r254, 8d
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -40d ADDI64 r254, r254, -32d
ST r31, r254, 0a, 40h ST r31, r254, 0a, 32h
JAL r31, r0, :get_ptr JAL r31, r0, :get_ptr
LI64 r32, 0d CP r32, r1
JNE r1, r32, :0 JNE r32, r0, :0
LI64 r1, 0d CP r1, r0
JMP :1 JMP :1
0: LI64 r33, 10d 0: LI64 r33, 10d
CP r34, r1 3: LD r34, r32, 0a, 8h
2: LD r1, r34, 0a, 8h JEQ r34, r33, :2
JEQ r1, r33, :1 ADDI64 r34, r34, 1d
ADDI64 r35, r1, 1d ST r34, r32, 0a, 8h
ST r35, r34, 0a, 8h JMP :3
JMP :2 2: CP r1, r34
1: LD r31, r254, 0a, 40h 1: LD r31, r254, 0a, 32h
ADDI64 r254, r254, 40d ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 208 code size: 164
ret: 10 ret: 0
status: Ok(()) status: Ok(())

@ -1,65 +1,57 @@
main: main:
ADDI64 r254, r254, -122d ADDI64 r254, r254, -58d
ST r31, r254, 26a, 96h ST r31, r254, 26a, 32h
JAL r31, r0, :returner_fn JAL r31, r0, :returner_fn
CP r32, r1 CP r32, r1
ADDI64 r1, r254, 2d ADDI64 r33, r254, 2d
CP r1, r33
JAL r31, r0, :returner_bn JAL r31, r0, :returner_bn
ADDI64 r33, r254, 0d ADDI64 r34, r254, 0d
JAL r31, r0, :returner_cn JAL r31, r0, :returner_cn
ST r1, r254, 0a, 2h ST r1, r34, 0a, 2h
LI8 r34, 0b LD r33, r254, 2a, 1h
LI8 r35, 0b CMPU r32, r32, r0
LD r36, r254, 2a, 1h CMPUI r32, r32, 0d
CP r1, r32 CMPU r33, r33, r0
ANDI r37, r37, 255d CMPUI r33, r33, 0d
ANDI r1, r1, 255d LD r34, r254, 0a, 1h
CMPU r37, r1, r34 AND r32, r33, r32
CMPUI r37, r37, 0d CMPU r33, r34, r0
ANDI r38, r38, 255d CMPUI r33, r33, 0d
ANDI r36, r36, 255d AND r32, r33, r32
CMPU r38, r36, r35 ANDI r32, r32, 255d
CMPUI r38, r38, 0d JNE r32, r0, :0
LD r39, r254, 0a, 1h CP r1, r0
AND r40, r38, r37
ANDI r41, r41, 255d
ANDI r39, r39, 255d
CMPU r41, r39, r35
CMPUI r41, r41, 0d
AND r42, r41, r40
ANDI r42, r42, 255d
JNE r42, r0, :0
LI64 r1, 0d
JMP :1 JMP :1
0: LI64 r1, 1d 0: LI64 r32, 1d
1: LD r31, r254, 26a, 96h CP r1, r32
ADDI64 r254, r254, 122d 1: LD r31, r254, 26a, 32h
ADDI64 r254, r254, 58d
JALA r0, r31, 0a JALA r0, r31, 0a
returner_bn: returner_bn:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
LI8 r6, 1b CP r15, r1
ADDI64 r5, r254, 0d LI8 r14, 1b
ST r6, r254, 0a, 1h ADDI64 r13, r254, 0d
LI64 r6, 0d ST r14, r254, 0a, 1h
ST r6, r254, 8a, 8h ST r0, r254, 8a, 8h
ST r6, r254, 16a, 8h ST r0, r254, 16a, 8h
BMC r5, r1, 24h BMC r13, r15, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
returner_cn: returner_cn:
ADDI64 r254, r254, -2d ADDI64 r254, r254, -2d
LI8 r4, 1b LI8 r13, 1b
ADDI64 r3, r254, 0d ST r13, r254, 0a, 1h
ST r4, r254, 0a, 1h ST r0, r254, 1a, 1h
LI8 r4, 0b LD r1, r254, 0a, 2h
ST r4, r254, 1a, 1h
LD r1, r3, 0a, 2h
ADDI64 r254, r254, 2d ADDI64 r254, r254, 2d
JALA r0, r31, 0a JALA r0, r31, 0a
returner_fn: returner_fn:
LD r1, r254, 0a, 0h LD r13, r254, 0a, 0h
ORI r1, r1, 128d ORI r13, r13, 128d
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 546 code size: 452
ret: 1 ret: 1
status: Ok(()) status: Ok(())

@ -1,135 +1,144 @@
decide: decide:
LI8 r1, 1b LI8 r13, 1b
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -224d ADDI64 r254, r254, -136d
ST r31, r254, 80a, 144h ST r31, r254, 96a, 40h
JAL r31, r0, :decide JAL r31, r0, :decide
LI64 r32, 0d CP r33, r0
ADDI64 r2, r254, 72d ADDI64 r34, r254, 88d
CP r33, r2 CP r32, r1
ANDI r1, r1, 255d ANDI r32, r32, 255d
JNE r1, r0, :0 JNE r32, r0, :0
CP r34, r32 CP r32, r33
JMP :1 JMP :1
0: CP r34, r33 0: CP r32, r34
1: JNE r34, r32, :2 1: JNE r32, r33, :2
LI64 r1, 9001d LI64 r32, 9001d
CP r1, r32
JMP :3 JMP :3
2: JAL r31, r0, :decide 2: JAL r31, r0, :decide
LI8 r35, 0b CP r33, r1
ANDI r1, r1, 255d ANDI r33, r33, 255d
JNE r1, r0, :4 JNE r33, r0, :4
LI8 r36, 1b LI8 r33, 1b
ST r36, r254, 56a, 1h ST r33, r254, 72a, 1h
LD r36, r34, 0a, 8h LD r32, r32, 0a, 8h
ST r36, r254, 64a, 8h ST r32, r254, 80a, 8h
JMP :5 JMP :5
4: ST r35, r254, 56a, 1h 4: ST r0, r254, 72a, 1h
5: LD r37, r254, 56a, 1h 5: LD r32, r254, 72a, 1h
ANDI r37, r37, 255d ANDI r32, r32, 255d
ANDI r35, r35, 255d JEQ r32, r0, :6
JEQ r37, r35, :6 LI64 r32, 42d
LI64 r1, 42d CP r1, r32
JMP :3 JMP :3
6: JAL r31, r0, :decide 6: JAL r31, r0, :decide
LI32 r38, 0w CP r33, r0
ANDI r1, r1, 255d CP r32, r1
JNE r1, r0, :7 ANDI r32, r32, 255d
CP r39, r38 JNE r32, r0, :7
CP r32, r33
JMP :8 JMP :8
7: LI32 r39, 2147483649w 7: LI32 r32, 2147483649w
8: ANDI r39, r39, 4294967295d 8: ANDI r32, r32, 4294967295d
ANDI r38, r38, 4294967295d ANDI r33, r33, 4294967295d
JNE r39, r38, :9 JNE r32, r33, :9
LI64 r1, 69d LI64 r32, 69d
CP r1, r32
JMP :3 JMP :3
9: ADDI64 r3, r254, 40d 9: ADDI64 r33, r254, 56d
CP r40, r3
JAL r31, r0, :new_foo JAL r31, r0, :new_foo
ST r1, r254, 40a, 16h ST r1, r33, 0a, 16h
LI64 r32, 0d LD r35, r254, 56a, 8h
LD r41, r254, 40a, 8h JNE r35, r0, :10
JNE r41, r32, :10 LI64 r32, 999d
LI64 r1, 999d CP r1, r32
JMP :3 JMP :3
10: LRA r4, r0, :"foo\0" 10: LRA r35, r0, :"foo\0"
CP r3, r40 ST r35, r254, 40a, 8h
CP r2, r3 LI64 r35, 4d
LD r2, r2, 0a, 16h ST r35, r254, 48a, 8h
LD r2, r33, 0a, 16h
LD r4, r254, 40a, 16h
JAL r31, r0, :use_foo JAL r31, r0, :use_foo
ADDI64 r42, r254, 0d ADDI64 r33, r254, 0d
JAL r31, r0, :no_foo JAL r31, r0, :no_foo
ST r1, r254, 0a, 16h ST r1, r33, 0a, 16h
JAL r31, r0, :decide JAL r31, r0, :decide
ANDI r1, r1, 255d CP r35, r1
JNE r1, r0, :11 ANDI r35, r35, 255d
CP r2, r33 JNE r35, r0, :11
JMP :12 JMP :12
11: CP r2, r33 11: ST r34, r254, 0a, 8h
ST r2, r254, 0a, 8h LI64 r35, 1d
LI64 r43, 1d ST r35, r254, 8a, 8h
ST r43, r254, 8a, 8h ST r35, r254, 88a, 8h
ST r43, r254, 72a, 8h 12: LD r35, r254, 0a, 8h
12: LD r44, r254, 0a, 8h JNE r35, r0, :13
JNE r44, r32, :13 LI64 r32, 34d
LI64 r1, 34d CP r1, r32
JMP :3 JMP :3
13: ADDI64 r1, r254, 16d 13: ADDI64 r35, r254, 16d
CP r1, r35
CP r2, r34
JAL r31, r0, :new_bar JAL r31, r0, :new_bar
JAL r31, r0, :decide JAL r31, r0, :decide
ANDI r1, r1, 255d CP r34, r1
JNE r1, r0, :14 ANDI r34, r34, 255d
JNE r34, r0, :14
JMP :15 JMP :15
14: ST r35, r254, 16a, 1h 14: ST r0, r254, 16a, 1h
15: LD r45, r254, 16a, 1h 15: LD r34, r254, 16a, 1h
ANDI r45, r45, 255d ANDI r34, r34, 255d
ANDI r35, r35, 255d JEQ r34, r0, :16
JEQ r45, r35, :16 LI64 r32, 420d
LI64 r1, 420d CP r1, r32
JMP :3 JMP :3
16: LD r46, r254, 0a, 8h 16: LD r33, r254, 0a, 8h
LD r47, r46, 0a, 8h LD r33, r33, 0a, 8h
ANDI r48, r39, 65535d ANDI r32, r32, 65535d
SUB64 r1, r48, r47 SUB64 r32, r32, r33
3: LD r31, r254, 80a, 144h CP r1, r32
ADDI64 r254, r254, 224d 3: LD r31, r254, 96a, 40h
ADDI64 r254, r254, 136d
JALA r0, r31, 0a JALA r0, r31, 0a
new_bar: new_bar:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
LI8 r8, 1b CP r14, r2
ADDI64 r7, r254, 0d CP r16, r1
ST r8, r254, 0a, 1h LI8 r15, 1b
ST r2, r254, 8a, 8h ADDI64 r13, r254, 0d
LI64 r9, 1d ST r15, r254, 0a, 1h
ST r9, r254, 16a, 8h ST r14, r254, 8a, 8h
BMC r7, r1, 24h LI64 r14, 1d
ST r14, r254, 16a, 8h
BMC r13, r16, 24h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
new_foo: new_foo:
ADDI64 r254, r254, -24d ADDI64 r254, r254, -24d
ADDI64 r3, r254, 0d ADDI64 r13, r254, 0d
ADDI64 r2, r254, 8d ST r13, r254, 8a, 8h
ST r3, r254, 8a, 8h ST r0, r254, 16a, 8h
LI64 r5, 0d LD r1, r254, 8a, 16h
ST r5, r254, 16a, 8h
LD r1, r2, 0a, 16h
ADDI64 r254, r254, 24d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
no_foo: no_foo:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -16d
ADDI64 r1, r254, 0d ST r0, r254, 0a, 8h
LI64 r3, 0d LD r1, r254, 0a, 16h
ST r3, r254, 0a, 8h
LD r1, r1, 0a, 16h
ADDI64 r254, r254, 16d ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
use_foo: use_foo:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -32d
ST r2, r254, 0a, 16h ST r2, r254, 16a, 16h
ADDI64 r2, r254, 0d ADDI64 r2, r254, 16d
ADDI64 r254, r254, 16d ST r4, r254, 0a, 16h
ADDI64 r4, r254, 0d
ADDI64 r254, r254, 32d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 1143 code size: 1162
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -1,30 +1,34 @@
inb: inb:
CP r1, r2 CP r13, r2
CP r1, r13
JALA r0, r31, 0a JALA r0, r31, 0a
main: main:
ADDI64 r254, r254, -32d ADDI64 r254, r254, -24d
ST r31, r254, 0a, 32h ST r31, r254, 0a, 24h
LI64 r32, 0d LI64 r32, 100d
LI64 r33, 100d 4: CP r2, r32
4: CP r2, r33
JAL r31, r0, :inb JAL r31, r0, :inb
ANDI r34, r1, 2d CP r33, r1
JNE r34, r32, :0 ANDI r33, r33, 2d
LI64 r2, 96d JNE r33, r0, :0
CP r3, r32 LI64 r33, 96d
CP r2, r33
CP r3, r0
JAL r31, r0, :outb JAL r31, r0, :outb
3: CP r2, r33 3: CP r2, r32
JAL r31, r0, :inb JAL r31, r0, :inb
JEQ r1, r32, :1 CP r33, r1
LI64 r1, 1d JEQ r33, r0, :1
LI64 r32, 1d
CP r1, r32
JMP :2 JMP :2
1: JMP :3 1: JMP :3
0: JMP :4 0: JMP :4
2: LD r31, r254, 0a, 32h 2: LD r31, r254, 0a, 24h
ADDI64 r254, r254, 32d ADDI64 r254, r254, 24d
JALA r0, r31, 0a JALA r0, r31, 0a
outb: outb:
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 198 code size: 203
ret: 1 ret: 1
status: Ok(()) status: Ok(())

@ -1,22 +1,21 @@
main: main:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -16d
ADDI64 r3, r254, 0d ADDI64 r13, r254, 0d
LI64 r6, 0d CP r3, r0
CP r3, r6 CP r4, r0
CP r4, r6 CP r5, r0
CP r5, r6 CP r6, r0
ECA ECA
ST r1, r254, 0a, 16h ST r1, r13, 0a, 16h
LI8 r8, 0b LD r14, r254, 0a, 1h
LD r9, r254, 0a, 1h ANDI r14, r14, 255d
ANDI r9, r9, 255d JNE r14, r0, :0
ANDI r8, r8, 255d
JNE r9, r8, :0
UN UN
0: LD r1, r254, 8a, 8h 0: LD r13, r254, 8a, 8h
CP r1, r13
ADDI64 r254, r254, 16d ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
unknown ecall: 0 unknown ecall: 0
code size: 142 code size: 124
ret: 0 ret: 0
status: Err(Unreachable) status: Err(Unreachable)

@ -1,34 +1,32 @@
main: main:
ADDI64 r254, r254, -104d ADDI64 r254, r254, -56d
ST r31, r254, 40a, 64h ST r31, r254, 24a, 32h
LI64 r32, 4d ADDI64 r32, r254, 0d
ADDI64 r33, r254, 24d LI64 r33, 1d
ADDI64 r34, r254, 0d ST r33, r254, 16a, 8h
ST r32, r254, 24a, 8h LI64 r34, 4d
LI64 r35, 1d ST r34, r254, 0a, 8h
ST r35, r254, 32a, 8h ST r33, r254, 8a, 8h
ST r35, r254, 16a, 8h
BMC r33, r34, 16h
JAL r31, r0, :opaque JAL r31, r0, :opaque
ST r1, r254, 0a, 16h ST r1, r32, 0a, 16h
LD r36, r254, 8a, 8h LD r33, r254, 8a, 8h
LD r37, r254, 16a, 8h LD r34, r254, 16a, 8h
ADD64 r38, r37, r36 ADD64 r33, r34, r33
LD r37, r254, 0a, 8h LD r32, r254, 0a, 8h
SUB64 r1, r37, r38 SUB64 r32, r32, r33
LD r31, r254, 40a, 64h CP r1, r32
ADDI64 r254, r254, 104d LD r31, r254, 24a, 32h
ADDI64 r254, r254, 56d
JALA r0, r31, 0a JALA r0, r31, 0a
opaque: opaque:
ADDI64 r254, r254, -16d ADDI64 r254, r254, -16d
LI64 r3, 3d LI64 r13, 3d
ADDI64 r2, r254, 0d ST r13, r254, 0a, 8h
ST r3, r254, 0a, 8h LI64 r13, 2d
LI64 r6, 2d ST r13, r254, 8a, 8h
ST r6, r254, 8a, 8h LD r1, r254, 0a, 16h
LD r1, r2, 0a, 16h
ADDI64 r254, r254, 16d ADDI64 r254, r254, 16d
JALA r0, r31, 0a JALA r0, r31, 0a
code size: 323 code size: 299
ret: 0 ret: 0
status: Ok(()) status: Ok(())

@ -0,0 +1,7 @@
main:
LI64 r13, 10d
CP r1, r13
JALA r0, r31, 0a
code size: 32
ret: 10
status: Ok(())

Some files were not shown because too many files have changed in this diff.