Should we learn basic programming concepts if AI writes the code for us?

Is the main question nowadays Developers vs. AI? Or is the right question Developers with AI?

Vladimir Topolev
Numatic Ventures

--

We live in a new era where AI can generate code for us in many cases, significantly changing how we program. This raises a question: is it worth spending time learning basic programming skills if AI can do the work for us, or would that be a waste of time? Should we focus on other things instead?

I will ask ChatGPT to write code in NodeJS that takes one file and copies its content into N other files. The prompt looks like this:

PROMPT 1: I have one Readable NodeJS Stream and I want to write it into N different Writable streams

The response looks like this:

const fs = require('fs');
const { pipeline } = require('stream');

const N = 3; // for example, copy into 3 files

// Define your readable stream (e.g., reading from a file)
const readableStream = fs.createReadStream('input.txt');

// Define N writable streams (e.g., writing to multiple output files)
const writableStreams = [];
for (let i = 0; i < N; i++) {
  writableStreams.push(fs.createWriteStream(`output${i}.txt`));
}

// Pipeline the readable stream to each of the writable streams
pipeline(
  readableStream,
  ...writableStreams,
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);

Look at this snippet closely: it creates a pipeline where the first stream is Readable, followed by N Writable streams:

Readable -> Writable1 -> Writable2 -> ... -> WritableN

If you are familiar with Streams and Pipelines, you already know this code won’t work as expected. A pipeline can go from one Readable stream to exactly one Writable stream at the end; any stages in between must be Transform or Duplex streams.

Okay, but even if we don’t know this fact, we can run the code and find out that it throws an error:

TypeError [ERR_INVALID_ARG_TYPE]: The "val" argument must be an instance of 
Readable, Iterable, AsyncIterable, ReadableStream, or TransformStream.
Received an instance of WriteStream

That’s not a problem: we can tell ChatGPT that the code snippet does not work and paste the error:

PROMPT 2: This code does not work and it throws the error: TypeError [ERR_INVALID_ARG_TYPE]: The “val” argument must be an instance of Readable, Iterable, AsyncIterable, ReadableStream, or TransformStream. Received an instance of WriteStream

This time, I got code that works as expected:

const fs = require('fs');

const N = 3; // for example, copy into 3 files

// Create a readable stream (replace 'source.txt' with your actual source file)
const readableStream = fs.createReadStream('source.txt');

// Create an array to hold N writable streams
const writableStreams = [];
for (let i = 0; i < N; i++) {
  // Replace 'output' with the desired output file names or destinations
  writableStreams.push(fs.createWriteStream(`output_${i}.txt`));
}

// Define a function to pipe the readable stream to each writable stream
const pipeToStreams = (readable, writableStreams) => {
  for (const writableStream of writableStreams) {
    readable.pipe(writableStream);
  }
};

// Call the function to pipe the readable stream to each writable stream
pipeToStreams(readableStream, writableStreams);

But this code has one drawback: the backpressure issue.

Backpressure is a common problem because reading is typically faster than writing. Without a mechanism to slow down the readable stream, unwritten data accumulates in memory, which can exhaust available memory and crash the application.

My next step was to ask ChatGPT to take into account backpressure:

PROMPT 3: It does not take into account backpressure, could you fix it, please?

The response was:

You’re correct; the previous solution doesn’t consider backpressure, which is an important aspect of stream handling, especially when dealing with multiple writable streams.

const fs = require('fs');

const N = 3; // for example, copy into 3 files

// Define your readable stream (e.g., reading from a file)
const readableStream = fs.createReadStream('input.txt');

// Define N writable streams (e.g., writing to multiple output files)
const writableStreams = [];
for (let i = 0; i < N; i++) {
  writableStreams.push(fs.createWriteStream(`output${i}.txt`));
}

// Define a function to write data to all writable streams
const writeToStreams = (data) => {
  writableStreams.forEach(stream => {
    // Check if the stream is writable
    if (stream.writable) {
      // Write data to the stream
      const result = stream.write(data);
      // If the stream's buffer is full, pause the readable stream
      if (!result) {
        readableStream.pause();
        // Once the stream drains, resume reading from the readable stream
        stream.once('drain', () => {
          readableStream.resume();
        });
      }
    }
  });
};

// Listen for data events on the readable stream and write to all writable streams
readableStream.on('data', (data) => {
  writeToStreams(data);
});

// Handle end of the readable stream
readableStream.on('end', () => {
  // Close all writable streams
  writableStreams.forEach(stream => {
    stream.end();
  });
});

// Handle errors
readableStream.on('error', (err) => {
  console.error('Readable stream error:', err);
  // Close all writable streams on error
  writableStreams.forEach(stream => {
    stream.destroy();
  });
});

Conclusions:

We should view ChatGPT as an assistant capable of speeding up our development process. I think of it as a smarter version of Google: instead of digging through dozens of pages, I get a concise, comprehensive response and, if needed, can ask for specific details in the same context afterward. It significantly improves the development workflow.

--

Vladimir Topolev
Numatic Ventures

Addicted Fullstack JS engineer. Loves ReactJS and everything related to animation.