Large language models (LLMs) like GPT-4 can already generate production-quality front-end code in Vue, React, or Svelte, even full apps with routing and state management. But those outputs still target today's frameworks, which carry nontrivial boilerplate, build steps, and runtime overhead. What if we asked the LLM to sidestep existing abstractions and write vanilla JavaScript instead? Or, better yet, to target a purpose-built "LLM-First" mini-framework that minimizes token usage, build complexity, and runtime cost?
In this post we'll compare three approaches to LLM-generated front-end code: a standard Vue 3 single-file component, the same widget in vanilla JavaScript, and a purpose-built "LLM-First" mini-framework.

First, the Vue baseline: a `.vue` file with `<template>`, `<script>`, and an optional `<style>` block.

Pros: familiar ergonomics, declarative templates, built-in reactivity, mature tooling.

Cons: a required build step, runtime bundle weight, and boilerplate that costs prompt tokens.
```vue
<script setup>
import { ref } from 'vue'

const count = ref(0)

function increment() {
  count.value++
}
</script>

<template>
  <div>
    <p>Count: {{ count }}</p>
    <button @click="increment">Increment</button>
  </div>
</template>
```
```js
// reactive.js (500 bytes)
export function reactive(obj, onChange) {
  return new Proxy(obj, {
    set(target, key, val) {
      target[key] = val;
      onChange();
      return true;
    }
  });
}
```
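One caveat worth noting (my addition, not part of the original `reactive.js`): the Proxy above only intercepts top-level assignments, so mutations on nested objects go unnoticed. A recursive variant, sketched under that assumption, wraps nested values lazily on read:

```js
// Sketch: recursive reactive() that also notifies on deep mutations.
function reactive(obj, onChange) {
  return new Proxy(obj, {
    get(target, key) {
      const val = target[key];
      // Lazily wrap nested objects so deep writes also fire onChange
      return (val !== null && typeof val === 'object')
        ? reactive(val, onChange)
        : val;
    },
    set(target, key, val) {
      target[key] = val;
      onChange();
      return true;
    }
  });
}
```

The trade-off is allocating a fresh Proxy on every nested read; real frameworks cache these wrappers.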
```js
// counter.js
import { reactive } from './reactive.js';

const state = reactive({ count: 0 }, render);

function increment() {
  state.count++;
}

function render() {
  document.body.innerHTML = `
    <p>Count: ${state.count}</p>
    <button id="inc">Increment</button>
  `;
  // innerHTML replacement destroys old listeners, so re-bind on each render
  document.getElementById('inc').onclick = increment;
}

render();
```
The whole app is two plain `.js` files loaded as ES modules. Key takeaway: vanilla JS cuts bundle size by an order of magnitude and trims the prompt by ~30%.
| Metric | Vue 3 | Vanilla JS |
|---|---|---|
| Prompt length | ~120 tokens | ~80 tokens |
| Bundle size (gz) | ~7 kB | ~0.5 kB |
| Build step | yes (~500 ms) | no |
| Dev feedback loop | slower | instant reload |
Clearly, boilerplate and runtime code in mainstream frameworks add both token and payload overhead. But vanilla JS loses out on ergonomics once your app grows beyond trivial widgets.
What if we formalize a tiny runtime, an "LLM-First Mini-framework" (LFM), with just four primitives: reactive state, effects, a hyperscript helper, and an app mounted on a root element?
```js
// lfm.js (~1 kB minified): the entire public API
export function createApp(root) { /* ... */ }             // takes root element
export function h(tag, props, ...children) { /* ... */ }  // hyperscript helper
export function reactive(obj) { /* ... */ }               // proxy-based state
export function effect(fn) { /* ... */ }                  // re-runs fn on change
```
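How might `reactive` and `effect` fit in that budget? Here is a hedged sketch of one possible reactivity core; the dependency-tracking scheme (record property reads during an effect, re-run subscribed effects on writes) is my assumption, not a spec of lfm.js:

```js
// Sketch of a minimal dependency-tracking reactivity core.
let activeEffect = null;

function effect(fn) {
  const runner = () => {
    activeEffect = runner;
    try { fn(); } finally { activeEffect = null; }
  };
  runner(); // run once up front to collect dependencies
  return runner;
}

function reactive(obj) {
  const deps = new Map(); // property key -> Set of subscribed effects
  return new Proxy(obj, {
    get(target, key) {
      if (activeEffect) {
        if (!deps.has(key)) deps.set(key, new Set());
        deps.get(key).add(activeEffect); // track the read
      }
      return target[key];
    },
    set(target, key, val) {
      target[key] = val;
      const subs = deps.get(key);
      if (subs) [...subs].forEach(run => run()); // notify on write
      return true;
    }
  });
}
```

Because the `Set` deduplicates subscribers, an effect that re-reads the same key on every run is only registered once.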
Usage pattern:
```js
import { createApp, h, reactive, effect } from './lfm.js';

const app = createApp(document.body);
const state = reactive({ todos: [] });

effect(() => {
  const list = state.todos.map(todo => h('li', {}, todo.text));
  app.render(h('ul', {}, ...list));
});

app.run(); // wires up reactivity
```
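To make `h` concrete, here is a sketch of what it could return, a plain vnode object, paired with an illustration-only `renderToString` (my assumption; the real runtime would patch the DOM rather than serialize):

```js
// Sketch: h() builds plain vnode objects; renderToString is for illustration.
function h(tag, props = {}, ...children) {
  return { tag, props, children: children.flat() };
}

function renderToString(node) {
  // Leaf values (text, numbers) render as-is
  if (typeof node === 'string' || typeof node === 'number') return String(node);
  const attrs = Object.entries(node.props)
    .map(([k, v]) => ` ${k}="${v}"`)
    .join('');
  return `<${node.tag}${attrs}>` +
    node.children.map(renderToString).join('') +
    `</${node.tag}>`;
}
```

For example, `renderToString(h('ul', {}, h('li', {}, 'Buy milk')))` yields `<ul><li>Buy milk</li></ul>`.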
| Approach | Prompt ↓ | Bundle ↓ | Iteration |
|---|---|---|---|
| Vue / React / Svelte | baseline | baseline | baseline |
| Vanilla JS | –30% | –90% | instant |
| LLM-First LFM | –50% | –85% (vs Vanilla) | instant |
Insight: By standardizing on a tiny, well-documented API, we give the LLM a narrow, familiar “vocabulary.” That slashes the prompt, cuts boilerplate, and still gives you reactive power.