NodeJS on the Frontend
What if you could import a generic NodeJS function into your frontend code? I'll show you how I made it possible. In the example below, statuses returns results from Firestore.
import React from 'react'
import { statuses } from './statuses.server.js'

export const Statuses = () => {
  const [loading, setLoading] = React.useState(false)
  if (loading) return <div>Loading...</div>
  return (
    <div
      onClick={async () => {
        setLoading(true)
        // statuses runs on the server, but we call it like a local function
        let results = await statuses('timeline')
        setLoading(false)
        console.log(results)
      }}
    >
      Load Statuses
    </div>
  )
}
Instead of writing network requests, wouldn't it be nice to write full-stack code and import it like a normal part of the frontend? Keep reading to find out how I got this working in a real application.
How things are done currently
In traditional apps, our frontend might have a file called Statuses.js that looks like this:
import React from 'react'

export const Statuses = () => {
  let statuses = ['first', 'second', 'third']
  return statuses.map(status => (
    <div key={status}>{status}</div>
  ))
}
Next, we decide to build a backend route. It might look like this:
app.get('/statuses', (req, res) => {
  // load some statuses from a database or other service
  // hard-code it for now
  let statuses = ['first', 'second', 'third']
  res.send(statuses)
})
Then you need to do something like this on your frontend:
let loadStatuses = async () => {
  let response = await fetch('https://myserver.com/statuses')
  let json = await response.json()
  return json
}
There's a bit more work to use a network request in React, but we won't get into that here. The point is that your server code can now change and break the frontend. Wouldn't it be cool to import the function directly and let TypeScript check it like normal, to prevent anything from breaking? Whether you use GraphQL or anything else, you still need to go into your backend project. Maybe it's in the same repo... but I almost guarantee it's not sitting right next to the code that uses it.
Wouldn't it be great if we could co-locate our server logic with our frontend?
We already put CSS, network requests, and everything else together. It would be interesting if we could put an entire feature, across the stack, together in the same code location.
gRPC
I got this idea after learning and using gRPC. The fundamental part of gRPC is having a single source of truth: in its case, a proto file. This file contains type definitions for gRPC services: what parameters they take, what data is returned, and how it will be returned.
It's like TypeScript, but across any language. You create this one type definition, and then you auto-generate code in whatever language you want. Maybe we want to use Rust for one gRPC backend, and then we want to call that function from our web frontend. We would create a proto definition, create the server function using the proto, then auto-generate a TypeScript function that we can call on the web. It looks a bit like this:
import * as grpcWeb from 'grpc-web';

import {EchoServiceClient} from './echo_grpc_web_pb';
import {EchoRequest, EchoResponse} from './echo_pb';

const echoService = new EchoServiceClient('http://localhost:8080', null, null);

const request = new EchoRequest();
request.setMessage('Hello World!');

const call = echoService.echo(request, {'custom-header-1': 'value1'},
  (err: grpcWeb.Error, response: EchoResponse) => {
    console.log(response.getMessage());
  });

call.on('status', (status: grpcWeb.Status) => {
  // ...
});
If you ask me.... that's some really ugly syntax that could be boiled down to:
let response = await echoService.echo({message: 'Hello World'})
This is one of my problems with gRPC... its official web package is written by people who clearly don't use JS often, or who love verbose syntax. I shouldn't complain, as anyone could make their own web generator from proto and improve the syntax. I just don't have the time or will for that one ;)
Why did I bring up gRPC? Well, it's cool, it supports bi-directional streaming, and well... it's the whole reason I got this idea! gRPC lets you call functions as if they exist locally in your application. That's the fundamental part of it. Write a gRPC server or client in any language, and just call an async function to get some data.
My Idea
Imagine not needing to handle anything to do with a backend Node server. When you run the dev server, it builds your web code, automatically pulls out the .server.js files, and runs them using Node. In production you would npm start, and your app's server would know to pick up these files and handle calling them when the frontend needs them.
I promise that this syntax IS possible, and I have it working. Write normal functions, put .server.js on the end of the filename, and that imported code becomes a promise and runs on a server!
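To make that concrete, here's a minimal sketch of what a statuses.server.js file could contain (the body is hypothetical; my real one queried Firestore):

// statuses.server.js
// ordinary Node code; it never ships to the browser
export const statuses = async (collection) => {
  // imagine a Firestore query here; hard-coded for this sketch
  return ['first', 'second', 'third']
}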
Webpack
webpack is a powerful tool that handles bundling application-level JS. It has a simple concept called loaders. A loader is just a function that receives the content of every file imported in your app and lets you do something with it. The most fundamental ones out there are ts-loader, babel-loader, css-loader, etc.
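For a feel of how small a loader can be, here's a minimal sketch of one (the __BUILD_TIME__ token is made up for illustration):

// a loader is just: (source string) => (transformed source string)
module.exports = function (source) {
  // replace a made-up token with the build timestamp, quoted as a JS string
  return source.replace(/__BUILD_TIME__/g, JSON.stringify(new Date().toISOString()));
};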
To make this crazy cool idea work, we need to make a loader. Here's how I started.
const path = require("path");
const HtmlWebpackPlugin = require("html-webpack-plugin");

module.exports = {
  mode: "development",
  entry: "./src/index.js",
  module: {
    rules: [
      {
        test: /\.js$/,
        use: [
          {
            loader: path.resolve("./config/magic/serverLoader.js"),
          },
        ],
      },
    ],
  },
  plugins: [new HtmlWebpackPlugin({ template: "./src/index.html" })],
};
I also had babel-loader in here so I could render React components and parse JSX, but this is the core webpack.config.js. All I'm doing is telling webpack to run in development mode and to pass all files ending with .js into my magical loader. After that, I'm setting up HtmlWebpackPlugin to generate an HTML file that includes our JS from webpack.
Server Loader
My breakthrough thought in this whole process was: What if I replace the contents of the file with a fetch request.... and then send the function name over the network, with the args passed in? How would I even write that....
module.exports = function (source) {
  if (!this.resourcePath.match(".server")) return source;
  let lines = source.split(/\n/);
};
Here's how I started. Let's skip all files that don't have .server in the name, then split the file into individual lines. We want to find all the exported variable names... One way would be to parse the JS into an AST, but I thought I could get away without parsing the files that way.
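For reference, the AST route might have looked roughly like this with @babel/parser — a sketch of the road not taken, not code from my loader:

const parser = require("@babel/parser");

// collect the names from `export const/let/var name = ...` statements
const getExportNames = (source) => {
  const ast = parser.parse(source, { sourceType: "module" });
  return ast.program.body
    .filter(
      (node) =>
        node.type === "ExportNamedDeclaration" &&
        node.declaration &&
        node.declaration.declarations
    )
    .flatMap((node) => node.declaration.declarations.map((d) => d.id.name));
};

Instead, here's the line-splitting version I actually wrote: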
// get the exported variable names
let exports = lines
  .filter((line) => line.match("export"))
  .map((exportLine) => {
    let [, , name] = exportLine.split(" ");
    return name;
  });
So that's what I wrote next. Match on the lines that contain the text export in them, then split that line and grab the name of the export: export const Blah = ... Since the syntax always has export followed by var, let, or const, then the name, this simple parsing should work just fine.
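To see why the destructure grabs the right token, take one concrete line:

// split on spaces and skip the first two entries ("export" and "const")
let [, , name] = "export const statuses = async () => {}".split(" ");
console.log(name); // "statuses"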
Really all that's left is replacing those exports with some network code.
let result = exports
  .map((exportName) =>
    `export const ${exportName} = serverRequest('${exportName}')`
  )
  .join("\n");
If I were making a production library, I would probably have the generated code import serverRequest from 'mylibrary' and actually make that function part of my library. For this proof of concept, I'll define it in the loader. Here's the final result:
serverLoader.js
module.exports = function (source) {
  if (!this.resourcePath.match(".server")) return source;

  let lines = source.split(/\n/);

  // get the exported variable names
  let exports = lines
    .filter((line) => line.match("export"))
    .map((exportLine) => {
      let [, , name] = exportLine.split(" ");
      return name;
    });

  // replace every export with a stub that calls the server
  let result = exports
    .map((exportName) => {
      return `export const ${exportName} = serverRequest('${exportName}')`;
    })
    .join("\n");

  // the file the browser bundle will actually contain
  let file = `
const serverRequest = (name) => async (...args) => {
  let response = await fetch('http://localhost:8000', {
    headers: {
      'Content-type': 'application/json',
    },
    method: 'POST',
    body: JSON.stringify({ name, args })
  })
  let json = await response.json()
  return json
}

${result}
`;

  return file;
};
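To make the transformation concrete: given the hypothetical statuses.server.js from earlier, the browser bundle ends up with something like this instead of the original file:

// generated by serverLoader.js — the real server code never reaches the browser
const serverRequest = (name) => async (...args) => { /* the fetch wrapper above */ }

export const statuses = serverRequest('statuses')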
Think about it: from the end user's perspective, they're importing a normal function, and the call signature is exactly the same. TypeScript will correctly check all the calls to our server exports too! With this loader, every one of those function calls now returns a promise and acts as expected.
Gotchas
There are a few things to note. If you are passing anything to your server functions that can't be serialized to JSON, like FormData, it won't work. Support could be added without much effort, but this example doesn't do it. I'm not handling errors either, which we would want in a real framework. Lastly, we would want to set the fetch URL from an environment or configuration variable so that it can be changed for production builds.
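As a rough sketch of those last two fixes (SERVER_URL is a hypothetical build-time constant, not something the loader above defines):

const SERVER_URL = process.env.SERVER_URL || 'http://localhost:8000'

const serverRequest = (name) => async (...args) => {
  let response = await fetch(SERVER_URL, {
    headers: { 'Content-type': 'application/json' },
    method: 'POST',
    body: JSON.stringify({ name, args }),
  })
  // surface server failures instead of silently parsing a bad response
  if (!response.ok) throw new Error(`${name} failed with status ${response.status}`)
  return response.json()
}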
Node
Well, we have the frontend part working. How do we make a server part that works automatically?
var express = require("express");
var cors = require("cors");
var bodyParser = require("body-parser");
const glob = require("glob");

var app = express();
app.use(cors());
app.use(bodyParser.json());

let functions = {};

glob(process.cwd() + "/**/*.server.js", {}, (err, files) => {
  files.forEach((file) => {
    let exports = require(file);
    functions = { ...functions, ...exports };
  });
});
I threw this together pretty quickly. If we use esm, we can import ES modules in Node as-is, so we just need to run this script like so: node -r esm server.js. We set up express, add CORS, and parse JSON body data. Pretty standard.
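In package.json terms, the two modes might look like this (the script names are my own, not from the project):

{
  "scripts": {
    "dev": "webpack serve",
    "start": "node -r esm server.js"
  }
}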
Next up, I do a recursive search for all files ending in .server.js under the current directory. Then I require all those files and add every exported function to one object. If we have two .server.js files, and they export test and test2 respectively, the functions object will look like this:
{
  test: () => {},
  test2: () => {}
}
With all these exports on a single object, we can grab the name our server loader sends in the JSON body and complete this server! Here's the remaining code:
app.post("/", async (req, res) => {
try {
let data = await functions[req.body.name](...req.body.args);
res.send(data);
} catch (e) {
console.error(e);
res.send({ error: "Function not found" });
}
});
app.listen(process.env.PORT || 8000, () =>
console.log(
`ClearScript Server running at http://localhost:${process.env.PORT || 8000}`
)
);
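To sanity-check the server by hand, you can replicate the loader's wire format with a plain fetch (this assumes Node 18+ for the global fetch; the name and args match the earlier examples):

// send the same JSON body the generated serverRequest stub sends
const test = async () => {
  let response = await fetch("http://localhost:8000", {
    method: "POST",
    headers: { "Content-type": "application/json" },
    body: JSON.stringify({ name: "statuses", args: ["timeline"] }),
  });
  console.log(await response.json());
};

test();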
Recap
So let's recap this:
1. Write NodeJS code, and make sure to name the file something.server.js
2. The webpack loader finds every export in .server.js files and replaces it with an exported fetch call carrying the function name
3. When a function is called, it sends the export name and arguments to the server
4. Our server requires all .server.js files and adds their exports to one object
5. When a request comes in, we invoke functions[name](...args)
I already have some prototype code that takes this idea a step further: when you run your webpack dev server, it runs the Node server and hot reloads when you change code, so the developer doesn't have to think about anything when writing server code.
Releasing to production would involve the normal frontend build process, and then running the special Node server we demoed here.
Thoughts?
Phew... what are your thoughts after reading? Is this practical? My opinion is yes. I wouldn't be surprised if Next.js took this idea, since they have already brought API routes a bit closer to the frontend. The only problem I see is new developers getting confused and blurring the lines a little TOO much between client and server. But I'm crazy; I like thinking up new magical syntaxes and ways of doing things.
Send me your thoughts by replying to the newsletter, or tweeting me.