Node.js long-running process runs twice

Date: 2018-03-14 00:11:45

Tags: node.js

I have a Node.js RESTful API built with the Express.js framework. It is normally hosted by pm2.

One of the services involves a very long-running process. When the frontend calls the service, the process starts. Because of an error in the database, the process cannot finish normally, and the error is eventually caught. However, before the first process reaches the error, a second, completely identical process starts with the same parameters. So for a while both processes are running, one ahead of the other. After a long time the first process hits the error point and returns the error; then the second one returns exactly the same thing.

I checked the frontend's network tab and found that only one request was actually sent. Where does the second request come from?

Edit 1:

The whole sequence is: first process sends a query to the db -> long wait -> second process starts -> second process sends a query to the db -> long wait -> first process receives the database response -> long wait -> second process receives the database response.

Edit 2:

The service code is as follows:

import { Express, Request, Response } from "express";
import * as multer from "multer";
import * as fs from "fs";
import { Readable, Duplex } from "stream";
import * as uid from "uid";
import { Client } from "pg";
import * as gdal from "gdal";
import * as csv from "csv";

import { SuccessPayload, ErrorPayload } from "../helpers/response";
import { postgresQuery } from "../helpers/database";
import Config from "../config";

export default class ShapefileRoute {
    constructor(app: Express) {

        // Upload a shapefile
        /**
          * @swagger
          * /shapefile:
          *   post:
          *     description: Uploads a shapefile
          *     responses:
          *       200:
          */
        app.post("/shapefile", (req: Request, res: Response, next: Function): void => {

            // Create instance of multer
            const multerInstance = multer().array("files");

            multerInstance(req, res, (err: Error) => {
                if (err) {
                    let payload: ErrorPayload = {
                        code: 4004,
                        errorMessage: "Multer upload file error.",
                        errorDetail: err.message,
                        hints: "Check error detail"
                    };

                    req.reservePayload = payload;

                    next();

                    return;
                }
                // Extract files
                let files: any = req.files;

                // Extract body
                let body: any = JSON.parse(req.body.filesInfo);

                // Other params
                let writeFilePromises: Promise<any>[] = [];

                let copyFilePromises: Promise<any>[] = [];

                let rootDirectory: string = Config.uploadRoot;

                let outputId: string = uid(4);

                // Reset index of those files
                let namesIndex: string[] = [];
                files.forEach((item: Express.Multer.File, index: number) => {
                    if(item.originalname.split(".")[1] === "csv" || item.originalname.split(".")[1] === "txt" || item.originalname.split(".")[1] === "shp") {
                        namesIndex.push(item.originalname);
                    }
                })

                // Process and write all files to disk
                files.forEach((item: Express.Multer.File, outterIndex: number) => {
                    if(item.originalname.split(".")[1] === "csv" || item.originalname.split(".")[1] === "txt") {
                        namesIndex.forEach((indexItem, index) => {
                            if(indexItem === item.originalname) {
                                ShapefileRoute.csv(item, index, writeFilePromises, body, rootDirectory, outputId,);
                            }
                        })
                    } else if (item.originalname.split(".")[1] === "shp") {
                        namesIndex.forEach((indexItem, index) => {
                            if(indexItem === item.originalname) {
                                ShapefileRoute.shp(item, index, writeFilePromises, body, rootDirectory, outputId,);
                            }
                        })
                    } else {
                        ShapefileRoute.shp(item, outterIndex, writeFilePromises, body, rootDirectory, outputId,);
                    }
                })

                // Copy files from disk to database
                ShapefileRoute.copyFiles(req, res, next, writeFilePromises, copyFilePromises, req.reserveSuperPg, () => {
                    ShapefileRoute.loadFiles(req, res, next, copyFilePromises, body, outputId)
                });


            })
        });
    }

    // Process csv file
    static csv(file: Express.Multer.File, index: number, writeFilePromises: Promise<any>[], body: any, rootDirectory: string, outputId: string) {

        // Streaming file to pivotcsv
        writeFilePromises.push(new Promise((resolve, reject) => {

            // Get specification from body
            let delimiter: string;
            let spec: any;
            let lrsColumns: string[] = [null, null, null, null, null, null];
            body.layers.forEach((jsonItem, i) => {
                if (jsonItem.name === file.originalname.split(".")[0]) {
                    delimiter = jsonItem.file_spec.delimiter;
                    spec = jsonItem
                    jsonItem.lrs_cols.forEach((lrsCol) => {
                        switch(lrsCol.lrs_type){
                            case "rec_id":
                            lrsColumns[0] = lrsCol.name;
                            break;
                            case "route_id":
                            lrsColumns[1] = lrsCol.name;
                            break;
                            case "f_meas":
                            lrsColumns[2] = lrsCol.name;
                            break;
                            case "t_meas":
                            lrsColumns[3] = lrsCol.name;
                            break;
                            case "b_date":
                            lrsColumns[4] = lrsCol.name;
                            break;
                            case "e_date":
                            lrsColumns[5] = lrsCol.name;
                            break;
                        }
                    })
                }
            });

            // Pivot csv file
            ShapefileRoute.pivotCsv(file.buffer, `${rootDirectory}/${outputId}_${index}`, index, delimiter, outputId, lrsColumns, (path) => {
                console.log("got pivotCsv result");
                spec.order = index;
                resolve({
                    path: path,
                    spec: spec
                });
            }, reject);
        }));
    }

    // Process shapefile
    static shp(file: Express.Multer.File, index: number, writeFilePromises: Promise<any>[], body: any, rootDirectory: string, outputId: string) {

        // Write file to disk and then call shp2csv to generate csv
        writeFilePromises.push(new Promise((resolve, reject) => {

            // Write shapefile to disk
            fs.writeFile(`${rootDirectory}/shps/${file.originalname}`, file.buffer, (err) => {
                // Reject if the file could not be written
                if (err) {
                    reject(err);
                    return;
                }

                // If it is .shp file, resolve it's path and spec
                if(file.originalname.split(".")[1] === "shp") {
                    // Find spec of the shapefile from body
                    body.layers.forEach((jsonItem, i) => {
                        if (jsonItem.name === file.originalname.split(".")[0]) {
                            let recordColumn: string = null;
                            let routeIdColumn: string = null;
                            jsonItem.lrs_cols.forEach((lrsLayer) => {
                                if (lrsLayer.lrs_type === "rec_id") {
                                    recordColumn = lrsLayer.name;
                                }
                                if (lrsLayer.lrs_type === "route_id") {
                                    routeIdColumn = lrsLayer.name;
                                }
                            })

                            // Transfer shp to csv
                            ShapefileRoute.shp2csv(`${rootDirectory}/shps/${file.originalname}`, `${rootDirectory}/${outputId}_${index}`, index, outputId, recordColumn, routeIdColumn, (path, srs) => {

                                // Add coordinate system, geom column and index of this file to spec
                                jsonItem.file_spec.proj4 = srs;
                                jsonItem.file_spec.geom_col = "geom";
                                jsonItem.order = index;

                                // Return path and spec
                                resolve({
                                    path: path,
                                    spec: jsonItem
                                })
                            }, (err) => {
                                reject(err);
                            })
                        }
                    });
                } else {
                    resolve(null);
                }
            })
        }));
    }

    // Copy files to database
    static copyFiles(req: Request, res: Response, next: Function, writeFilePromises: Promise<any>[], copyFilePromises: Promise<any>[], client: Client, callback: () => void) {
        // Take all files generated by writefile processes
        Promise.all(writeFilePromises)
        .then((results) => {

            // Remove null results. They are from .dbf .shx etc of shapefile.
            const files: any = results.filter(arr => arr);

            // Create promise array. This will be triggered after all files are written to database.
            files.forEach((file) => {
                copyFilePromises.push(new Promise((copyResolve, copyReject) => {
                    let query: string = `copy lbo.lbo_temp from '${file.path}' WITH NULL AS 'null';`;

                    // Create super user call
                    postgresQuery(client, query, (data) => {
                        copyResolve(file.spec);
                    }, copyReject);
                }));
            });

            // Trigger upload query
            callback()
        })
        .catch((err) => {
            // Response as error if any file generating is wrong
            let payload: ErrorPayload = {
                code: 4004,
                errorMessage: "Something wrong when processing csv and/or shapefile.",
                errorDetail: err.message,
                hints: "Check error detail"
            };

            req.reservePayload = payload;

            next();

        })
    }


    // Load layers in database
    static loadFiles(req: Request, res: Response, next: Function, copyFilePromises: Promise<any>[], body: any, outputId: string) {

        Promise.all(copyFilePromises)
        .then((results) => {

            // Resort all results by the order assigned when creating files
            results.sort((a, b) => {
                return a.order - b.order;
            });
            results.forEach((result) => {
                delete result.order;
            });

            // Create JSON for load layer database request
            let taskJson = body;
            taskJson.layers = results;
            let query: string = `select lbo.load_layers2(p_session_id := '${outputId}', p_layers := '${JSON.stringify(taskJson)}'::json)`;

            postgresQuery(req.reservePg, query, (data) => {
                // Get result
                let result = data.rows[0].load_layers2.result;

                // Return 4003 error if no result
                if (!result) {
                    let payload: ErrorPayload = {
                        code: 4003,
                        errorMessage: "Load layers error.",
                        errorDetail: data.rows[0].load_layers2.error ? data.rows[0].load_layers2.error.message : "Load layers returns no result.",
                        hints: "Check error detail"
                    };

                    req.reservePayload = payload;

                    next();

                    return;
                }
                let payload: SuccessPayload = {
                    type: "string",
                    content: "Upload files done."
                };

                req.reservePayload = payload;

                next();
            }, (err) => {
                req.reservePayload = err;

                next();
            });

        })
        .catch((err) => {
            // Response as error if any file generating is wrong
            let payload: ErrorPayload = {
                code: 4004,
                errorMessage: "Something wrong when copy files to database.",
                errorDetail: err,
                hints: "Check error detail"
            };

            req.reservePayload = payload;

            next();
        })

    }

    // Pivot csv process. Write output csv to disk and return path of the file.
    static pivotCsv(buffer: Buffer, outputPath: string, inputIndex: number, delimiter: string, outputId: string, lrsColumns: string[], callback: (path: string) => void, errCallback: (err: Error) => void) {

        let inputStream: Duplex = new Duplex();

        // Define output stream
        let output = fs.createWriteStream(outputPath, {flags: "a"});
        // Callback when output stream is done
        output.on("finish", () => {
            console.log("output stream finish");
            callback(outputPath);
        });

        // Define parser stream
        let parser = csv.parse({
            delimiter: delimiter
        });
        // Close output stream when parser stream is end
        parser.on("end", () => {
            console.log("parser stream end");
            output.end();
        });
        // Write data when a chunk is parsed
        let header = [null, null, null, null, null, null];
        let attributesHeader = [];
        let i = 0;
        let datumIndex: boolean = true;
        parser.on("data", (chunk) => {
            console.log("parser received chunk: ", i);
            if (datumIndex) {
                chunk.forEach((datum, index) => {
                    if (lrsColumns.includes(datum)) {
                        header[lrsColumns.indexOf(datum)] = index;
                    } else {
                        attributesHeader.push({
                            name: datum,
                            index: index
                        })
                    }
                });
                datumIndex = false;
            } else {
                i ++;
                // let layer_id = ;
                let rec_id = header[0] ? chunk[header[0]] : i;
                let route_id = header[1] ? chunk[header[1]] : null;
                let f_meas = header[2] ? chunk[header[2]] : null;
                let t_meas = header[3] ? chunk[header[3]] : null;
                let b_date = header[4] ? chunk[header[4]] : null;
                let e_date = header[5] ? chunk[header[5]] : null;

                let attributes = {};

                attributesHeader.forEach((attribute) => {
                    attributes[attribute.name] = chunk[attribute.index];
                });

                let attributesOrdered = {};
                Object.keys(attributes).sort().forEach((key) => {
                    attributesOrdered[key] = attributes[key];
                });

                let outputData = `${outputId}\t${inputIndex}\t${rec_id}\t${route_id}\tnull\t${f_meas}\t${t_meas}\t${b_date}\t${e_date}\tnull\t${JSON.stringify(attributesOrdered)}\n`;

                output.write(outputData);
            }
        });



        inputStream.push(buffer);
        inputStream.push(null);
        inputStream.pipe(parser);
    }

    // Write shp and transfer to database format. Return file path and projection.
    static shp2csv(inputPath: string, outputPath: string, i: number, ouputId: string, recordColumn: string, routeIdColumn: string, callback: (path: string, prj: string) => void, errCallback: (err: Error) => void) {
        let dataset = gdal.open(inputPath);
        let layercount = dataset.layers.count();
        let layer = dataset.layers.get(0);
        let output = fs.createWriteStream(outputPath, {flags: "a"});
        output.on("finish", () => {
            callback(outputPath, layer.srs.toProj4());
        });
        layer.features.forEach((feature, featureId) => {
            let geom;
            let recordId: number = null;
            let routeId: string = null;
            try {
                let geomWKB = feature.getGeometry().toWKB();
                let geomWKBString = geomWKB.toString("hex");
                geom = geomWKBString;
                if (recordColumn) {
                    recordId = feature.fields.get(recordColumn);
                }
                if (routeIdColumn) {
                    routeId = feature.fields.get(routeIdColumn);
                }
            }
            catch (err) {
                console.log(err);
            }
            let attributes = {};

            let attributesOrdered = {};

            feature.fields.forEach((value, field) => {
                if (field != recordColumn && field != routeIdColumn) {
                    attributes[field] = value;
                }
            });

            Object.keys(attributes).sort().forEach((key) => {
                attributesOrdered[key] = attributes[key];
            });
            output.write(`${ouputId}\t${i.toString()}\t${recordId ? recordId : (featureId + 1).toString()}\t${routeId}\tnull\tnull\tnull\tnull\tnull\t${geom}\t${JSON.stringify(attributesOrdered)}\n`);
        });
        output.end();
    }
}

1 Answer

Answer 0 (score: 1)

Browsers retry certain requests if the server never sends a response and the browser reaches its timeout value. Each browser may configure its own timeout, but two minutes sounds very plausibly like a browser timeout.

You cannot control the browser's timeout from your server, and two minutes is too long to make it wait. You need a different design that responds more quickly and then delivers the final result once it is ready: either client polling or a server push using a webSocket/socket.io connection.

For client polling, your server would respond immediately to the first request and return a token (some unique string). The client can then ask the server for the response to that token every minute until the server finally has one. If the server does not have a response yet, it just returns right away with a code that means "no response yet". The client then sets a timer and tries again a minute later, sending the token each time so the server knows which request it is asking about.
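The polling flow above can be sketched as follows. This is a minimal, framework-free sketch of the token/job-store logic only; the function names (`startJob`, `pollJob`) and the token format are illustrative, and in a real deployment this logic would sit behind Express route handlers (the existing POST /shapefile plus a new status endpoint), with the job store in Redis or the database rather than in process memory.

```typescript
// Minimal sketch of the token-and-poll pattern. Hypothetical names throughout.
type Job = { done: boolean; result?: string; error?: string };

const jobs: Map<string, Job> = new Map();
let counter = 0;

// "POST /shapefile": register the job, kick off the work in the background,
// and return a token immediately instead of waiting for the work to finish.
function startJob(work: () => Promise<string>): string {
    const token = `job-${++counter}`;   // illustrative; the real route could use uid()
    jobs.set(token, { done: false });
    work()
        .then(result => jobs.set(token, { done: true, result }))
        .catch(err => jobs.set(token, { done: true, error: String(err) }));
    return token;
}

// "GET /shapefile/status/:token": report whether the job has finished yet.
function pollJob(token: string): { pending: boolean; result?: string; error?: string } {
    const job = jobs.get(token);
    if (!job) throw new Error("unknown token");
    return job.done
        ? { pending: false, result: job.result, error: job.error }
        : { pending: true };
}
```

The client keeps the token from the first response and calls the status endpoint on a timer until `pending` comes back `false`.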

For a server push, the client creates a persistent webSocket or socket.io connection to the server. When the client makes the long-running request, the server immediately returns the same kind of token described above. Then, when the server finishes the request, it sends the token and the final data over the socket.io connection. The client is listening for incoming messages on that socket.io connection and receives the final response there.
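The push variant can be sketched like this. To stay self-contained it uses Node's stdlib `EventEmitter` as a stand-in for the persistent connection; a real socket.io `Socket` exposes `emit()`/`on()` with the same call shape, so only the transport differs. The function name `handleLongRequest`, the `"jobDone"` event name, and the token format are all illustrative.

```typescript
import { EventEmitter } from "events";

// Stand-in for a persistent socket.io connection between server and client.
const socket = new EventEmitter();

let counter = 0;

// Server side: answer the long-running HTTP request immediately with a token,
// then push { token, result } over the socket once the work finishes.
function handleLongRequest(work: () => Promise<string>): string {
    const token = `job-${++counter}`;   // illustrative token format
    work().then(result => {
        // With socket.io this would be socket.emit("jobDone", { token, result })
        socket.emit("jobDone", { token, result });
    });
    return token;
}

// Client side: remember the token from the HTTP response and wait for the push.
const received: { token: string; result: string }[] = [];
socket.on("jobDone", (msg: { token: string; result: string }) => {
    received.push(msg);
});
```

In a real deployment the server would keep a map from token to the connected socket that issued the request, so the push reaches the right client even with many connections open.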