import {sum} from './sum.js' with {type: 'comptime'};
is an unfortunate abuse of the `type` import attribute. `type` is the one spec-defined attribute, and it's supposed to correspond to the MIME type of the imported module, which is why the two web-platform-supported types are "json" and "css". The MIME type of the imported file in this case is still `application/javascript`, so if this module had a type it would be "js".
It would have been better to choose a different import attribute altogether.
You’re projecting the MIME type idea from two examples, but the proposal is intentionally agnostic about what `type` might be used for:
> This proposal does not specify behavior for any particular attribute key or value. The JSON modules proposal will specify that type: "json" must be interpreted as a JSON module, and will specify common semantics for doing so. It is expected the type attribute will be leveraged to support additional module types in future TC39 proposals as well as by hosts.
I literally just want Rust style macros and proc macros in JavaScript. e.g. using
```
const MyComponent = () => jsx!(<div></div>)
```
rather than a .tsx file.
That or wasm to be usable so I can just write my web apps in Rust
Maybe the (relative) lack of ecosystem has kept you away, but I really recommend checking out both Dioxus and Leptos. Leptos is incredibly similar to React, but with Rust ergonomics, and it's been a pleasure to learn and use. With an LLM by my side that knows React and Rust pretty well, I've found myself not even needing the React libraries that I thought I would, since I can easily build the features/components I actually need on the fly.
I too, eventually gave up on React <> WASM <> Rust but I was able to port all my existing React over into Leptos in a few hours.
Yeah, they are great; it's more the poor integration and lack of parallelism that make it not worthwhile.
Thunking everything through JavaScript and not being able to take advantage of fearless concurrency severely restrict the use cases. May as well just use TypeScript and React at that point.
The Bun authors (and others) would probably do well not to repurpose already-understood terminology. "Macros" are already understood to mean code that produces other code. "Comptime" is a nice alternative, but Bun's "macros" aren't macros in that sense.
We had sweet-js macros as a library many years ago but it looks like it went nowhere, especially after an incompatible rewrite that (afaik) remains broken for even basic cases. (Caveat: been a while since I looked at it)
No need for macros.
https://github.com/lite-jsx/core
Every once in a while I get a strong urge to hack on sweet.js to add typescript support
That particular example is odd. What are you gaining by having a macro that needs a compile step vs no macro and just configuring your compile step to use a JSX loader for js files?
The general idea is something like prebaking computation into your deployed JS/TS. This is much more general than JSX-related tools, and a lot cheaper to run. In JS applications I often find myself doing various small bits of work on startup; comptime.ts would move all these bits to build time.
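As a sketch of the idea (the precomputation is shown in plain JS; the `comptime(...)` call in the comment is illustrative, not comptime.ts's actual API):

```javascript
// Before: this table is rebuilt on every startup.
const ESCAPE_TABLE = Array.from({ length: 128 }, (_, i) =>
  i < 32 ? "\\u" + i.toString(16).padStart(4, "0") : String.fromCharCode(i)
);

// After a comptime-style transform, the deployed bundle would instead
// contain the precomputed literal directly, e.g.:
//   const ESCAPE_TABLE = ["\\u0000", "\\u0001", /* ... */ "}", "~"];
// paying the cost once at build time instead of on every startup.
console.log(ESCAPE_TABLE[65]); // "A"
```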
Oh, I get the value of comptime! I was specifically responding to the rust-like macros comment
There are quite a lot of valid use cases for being able to transform arbitrary tokens into JavaScript at "compile" time. One that already exists is JSX, a macro that is baked into the TypeScript compiler but restricted/tailored to React-style libraries.
We sort of get around this today using template literals and eval, but it's janky. https://github.com/developit/htm
A generic macro system could open the door to a framework like Svelte, Angular, Vue, etc being able to embed their template compilers (with LSP support) without wrapper compilers and IDE extensions.
e.g. imagine syntax like this being possible (not saying it's good)
```
export class MyComponent {
  #[reactive]
  count = 0;

  render = template!{
    <button on:click={() => this.count++}>Clicked {this.count} times</button>
  };
}
svelte.init(MyComponent, document.body)
```
Where the `template!` macro instructs the engine how to translate the tokens into their JavaScript syntax and the `#[reactive]` macro converts the class member into a getter/setter that triggers a re-render calculation.
It would need to be adopted by TC39, of course, and the expectation would be that a JavaScript engine could handle the preprocessing if provided at runtime; however, transpilers should be able to pre-compute the outputs so they don't need to be evaluated at runtime.
Writing a web app at the moment with C++/Emscripten. What makes wasm unusable in Rust?
It's not unusable per se; however, being unable to take advantage of Rust's fearless concurrency and having to glue everything together with JavaScript severely restrict the usefulness.
May as well just use TypeScript and React at that point.
The dream is to be able to specify only a wasm file in an html script tag, have the tab consume under 1mb of memory and maximise the use of client hardware to produce a flawless user experience across all types of hardware.
You want manual memory management for your web apps?
Rust memory management is... profoundly not manual?
Case in point: I use Rust/WASM in all of my web apps to great effect, and memory is never a consideration. In Rust you pretty much never think about freeing or memory.
On top of that, when objects are moved across to be owned by JS, FinalizationRegistry is able to clean them up pretty much perfectly, so they're GC-ed as normal.
Wrangling the borrow checker seems pretty manual at times. And I don’t know why you’d bother with a persnickety compile time GC when JS’s GC isn’t a top issue for front end development.
You stop noticing the borrow checker after a while and being able to write insanely parallel/performant code is quite rewarding.
Again, not all websites need to be usable on low end hardware/have a 1mb memory footprint - but there are a lot of use cases that would benefit.
Think: browser extensions that load on every tab, consume 150 MB+ multiplied by the number of open tabs, and share the main thread with the website.
ServiceWorkers that sit as background processes in your OS even when the browser is closed, that sort of thing.
I use Rust for all the other reasons, real types being a major one of them:
https://hn.algolia.com/?type=comment&query=typescript%20soun...
It's kinda exhausting to use TypeScript and run into situations where the type system is more of a suggestion than a rule. Passing around values [1] that have a type annotation but aren't the type they're annotated as is... in many ways worse than not typing them in the first place.
[1]: not even deserialized ones - ones that only moved within the language!
The borrow checker just verifies that you're handling the concept of ownership of memory correctly.
The actual management of memory (allocating, reclaiming, etc.) is all handled automagically for you.
There is no need for the concept of ownership of memory in JavaScript. So you are wasting time on a concept that doesn't matter in languages with a real GC. Dealing with ownership = manual memory management.
You can still have ownership issues and leaks even with a GC, if an object is reachable from a root. e.g. object A is in a cache and it references object B which references objects C D E F G ... which will now never get collected.
If A owns B then that is as expected but if A merely references B then it should hold a WeakRef
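The distinction, sketched in JavaScript (the cache shape here is hypothetical):

```javascript
const b = { payload: "large object graph" };

// Strong reference: while the cache entry is reachable from a root,
// `b` and everything it references can never be collected.
const cache = new Map();
cache.set("entryA", { owns: b });

// Weak reference: the GC may reclaim the target once no strong
// references remain; deref() then returns undefined, so callers
// must handle the "already collected" case.
const ref = new WeakRef(b);
console.log(ref.deref() === b); // true while `b` is still strongly held
```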
This used to not be true: once upon a time, Internet Explorer kept memory for DOM nodes and JavaScript objects separate, so it was very easy to leak memory by keeping reference cycles between the two.
Now, with all the desire for WASM to have DOM access I wonder if we'll end up finding ourselves back in that position again.
I really really (really) don’t want Rust style macros and proc macros in JavaScript (or TypeScript), ever.
Might be a good idea to advocate for faster progress in wasm so fans of the feature don't try to pollute the language :p
I have read the examples, and it seems like this cannot be used for aggressive hoisting of conditionals by writing "if (comptime foo)", resulting in the body of the if statement being executed unconditionally or omitted. So it cannot replace my current use of C preprocessor macros in JavaScript, though Zig's actual comptime feature could.
This can be used as part of that step (i.e. converting `foo()` into `true` or `false`), but I think the expectation is that you'll have another step in the build process that automatically strips away `if(false)...` statements and inlines `if(true)` ones.
Almost any minifier will automatically do this, for example, and most can be configured so that they only do constant folding/dead code elimination, so the result will be a file that looks like the one you've written, but with these comptime conditions removed/inlined.
Obviously with C preprocessor macros, you've got one tool that evaluates the condition and removes the dead code, but with comptime you have more flexibility and your conditions are all written in Javascript rather than a mix of JS and preprocessor macros.
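A sketch of the two stages described above (the commented-out `comptime(...)` call is illustrative, not comptime.ts's actual API):

```javascript
// Stage 1: the comptime tool evaluates the expression to a literal, so
//   if (comptime(buildFlags().debug)) { debugCalls += 1; }
// becomes:
let debugCalls = 0;
if (false) {
  debugCalls += 1; // dead branch, still present in the intermediate output
}

// Stage 2: a minifier's dead-code elimination (e.g. terser with
// `compress: { dead_code: true }`) removes the `if (false)` block
// entirely, so the shipped bundle contains only the live code.
console.log(debugCalls); // 0
```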
I think it is a bit more troublesome to ship this comptime implementation and a minifier to the client so the specialization may be performed there than it is to ship just a C preprocessor implementation.
That'll depend a lot on the context and the client, I imagine. Comptime and the minifier can be distributed in a fairly standard way as part of an NPM package's dependencies, so if you're shipping to a system that can handle NPM, then comptime doesn't really add much. But if the client doesn't have a JS runtime installed or can't easily access the NPM ecosystem, then I can imagine shipping the C preprocessor could well be easier than juggling two different tools.
Yes it can, the comptime expression inside the if would turn into a `true` or `false` literal, but you would need a separate build tool to optimize away the if. That's partly why comptime.ts outputs TypeScript iirc.
I believe both Vite and Bun bundler would apply the optimization to eliminate constant conditionals when you use comptime.ts as a plugin.
I don't have any experience running the Vite or Bun bundlers or the typescript compiler on the client, but I think these are not really supported use cases.
Why are you trying to run this on the client? At the time you’re shipping code you’re already past the point of comptime. It’d be wasteful to do anything client side beyond execute the code you sent.
I could imagine this being useful for pre-compiling markdown.
One of the most exciting features of Zig, but am I correct that this doesn’t apply to types themselves like comptime generics in Zig? I find that to be one of the most powerful ideas: type level mappings that have the same syntax as the runtime code where you can just set an iteration limit. This would be a great way to get around the “too large union” problem in TS, for example.
Interesting. I've never seen the import-with syntax, though, and it's hard to find any documentation on it. Is this a syntax extension?
It’s been introduced as part of ECMAScript 2026 https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
It first started as an assert statement[0], for those who may have seen that; these type statements are an evolution of that proposal.
I do wonder if this makes the importable gets (via type: json) a reality like assert was going to.
[0]: https://v8.dev/features/import-assertions
> I do wonder if this makes the importable gets (via type: json) a reality like assert was going to.
Yes, the JSON modules proposal is finished.
https://github.com/tc39/proposal-json-modules
https://caniuse.com/mdn-javascript_statements_import_import_...
An entire class of fetch requests will go away with importable GETs. I am excited for this.
In node you could always require("food.json")
Not what I am talking about though.
I’m talking about: in place of a fetch call, you could simply import a JSON response from an endpoint, thereby bypassing the need to call fetch, and you’ll get the response as if it’s imported.
It won’t replace all GET calls certainly but I can think of quite a few first load ones that can simply be import statements once this happens
Ohh right. That makes sense.
Would be really great if it could return named functions
I have had many discussions with the author, and we ultimately decided not to support those kinds of use cases until we have a very solid set of guarantees. Supporting closures can quickly become very tricky when you need to preserve a function across JS processes.
I just want to be able to select dependencies at bundle time depending on the build environment. If it's in dev, use `MockService`. If it's in prod, use `ProdService`. Right now I just have `index.prod.ts` and `index.dev.ts` that choose the dependencies, which is not a bad solution; I just wish I could keep my initialization code in one file and have functions return the dependencies based on the environment. I can do this at runtime obviously, but it doesn't seem to eliminate unused dependencies well.
I know it's a cursed idea, but I often find myself wishing TypeScript had a C++-style preprocessor.
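One common workaround today is define-replacement plus dead-code elimination (a sketch; the stub classes are hypothetical, and the esbuild flag in the comment is one way to configure the replacement):

```javascript
// Stub services for illustration; in a real setup these would be imports,
// and the bundler would drop whichever one becomes unreachable.
class ProdService { name = "prod"; }
class MockService { name = "mock"; }

// With define-replacement (e.g. esbuild's
// --define:process.env.NODE_ENV='"production"'), the condition becomes a
// literal at bundle time, so one branch (and its import) is eliminated.
function makeService() {
  return process.env.NODE_ENV === "production"
    ? new ProdService()
    : new MockService();
}

process.env.NODE_ENV = "development";
console.log(makeService().name); // "mock"
```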
Sweet. No need for a framework to do that.