javascript - Node.js, the pyramid of doom (even with async), can you write it better? - Stack Overflow


I consider myself a very experienced node.js developer.

Yet I still wonder if there is a better way to write the following code so I don't end up with the pyramid of doom... And I went easy on you here: I have code where the pyramid gets as high as 20 levels, no kidding, and that's WITH using async.js!

The problem is really that I have many dependencies on previous variables, so everything must be nested. The author of the book "Async JavaScript: Build More Responsive Apps with Less Code" explains that he would put the functions at the root scope, which, sure, would get rid of the pyramid, but now you have a whole bunch of high-scope variables (possibly even global, depending on the scope you declare them at), and this pollution can cause some pretty nasty bugs: var conflicts with other scripts if set in global space (sure, you could use self-invoking functions, more yuck...), or even worse, since we are dealing with async code, variable overwrites. In fact, the beauty of closures is pretty much out the door.

What he recommends is doing something like:

function checkPassword(username, passwordGuess, callback) {
    var passwordHash;
    var queryStr = 'SELECT * FROM user WHERE username = ?';
    db.query(queryStr, username, queryCallback);
    function queryCallback(err, result) {
        if (err) throw err;
        passwordHash = result['password_hash'];
        hash(passwordGuess, hashCallback);
    }

    function hashCallback(passwordGuessHash) {
        callback(passwordHash === passwordGuessHash);
    }
}

again, not a clean approach IMHO.

So, if you look at my code (again, this is just a snippet; I get much bigger nests in other places) you will often see my code drifting further and further to the right, and that's with using things like waterfall and async's forEach...

here is a small example:

ms.async.eachSeries(arrWords, function (key, asyncCallback) {
    pg.connect(pgconn.dbserver('galaxy'), function (err, pgClient, pgCB) {
        var statement = "SELECT * FROM localization_strings WHERE local_id = 10 AND string_key = '" + key[0] + "'";
        pgClient.query(statement, function (err, result) {
            if (pgconn.handleError(err, pgCB, pgClient)) return;
            // if key doesn't exist go ahead and insert it
            if (result.rows.length == 0) {
                statement = "SELECT nextval('last_resource_bundle_string_id')";
                pgClient.query(statement, function (err, result) {
                    if (pgconn.handleError(err, pgCB, pgClient)) return;
                    var insertIdOffset = parseInt(result.rows[0].nextval);
                    statement = "INSERT INTO localization_strings (resource_bundle_string_id, string_key, string_revision, string_text,modified_date,local_id, bundle_id) VALUES ";
                    statement += "  (" + insertIdOffset + ",'" + key[0] + "'," + 0 + ",'" + englishDictionary[key[0]] + "'," + 0 + ",10,20)";
                    ms.log(statement);
                    pgClient.query(statement, function (err, result) {
                        if (pgconn.handleError(err, pgCB, pgClient)) return;
                        pgCB();
                        asyncCallback();
                    });
                });
                return; // the nested callbacks above finish this iteration; don't fall through
            }
            pgCB();
            asyncCallback();
        });
    });
});

On my deep scripts I counted over 25 closing parentheses, CRAZY, and all while remembering where to call my last callback so async continues to the next iteration...

Is there a solution to this problem? Or is it just the nature of the beast?


Asked Aug 29, 2014 at 20:34 by born2net; edited Aug 29, 2014 at 20:51 by Jordan Running.
  • 2 Note that you shouldn't use passwordHash === passwordGuessHash to compare a hashed string with a password, as this is vulnerable to timing attacks. Your crypto hash library (bcrypt) should have a cryptographically-safe comparison function you can use instead. – Stuart P. Bentley Commented Aug 29, 2014 at 20:46
  • If you want to -aesthetically- avoid having a lot of nesting, you can use nimble to chain the calls. If you want to -stackframely- avoid having a lot of nesting, you can defer the next call using a timer, declaring each function. – Luis Masuelli Commented Aug 29, 2014 at 20:47
  • 2 You can use a promise-based style using something like q. – tcooc Commented Aug 29, 2014 at 20:47
  • There is no framework that I know of for node.js to get away from async... – born2net Commented Aug 29, 2014 at 20:50
  • have you tried wait.for? github.com/luciotato/waitfor – Lucio M. Tato Commented Sep 4, 2014 at 21:34

5 Answers

As Mithon said in his answer, promises can make this code much clearer and help to reduce duplication. Let's say that you create two wrapper functions that return promises, corresponding to the two database operations you're performing, connectToDb and queryDb. Then your code can be written as something like:

ms.async.eachSeries(arrWords, function (key, asyncCallback) {
  var stepState = {};
  connectToDb('galaxy').then(function(connection) {
    // Store the connection objects in stepState
    stepState.pgClient = connection.pgClient;
    stepState.pgCB = connection.pgCB;

    // Send our first query across the connection
    var statement = "SELECT * FROM localization_strings WHERE local_id = 10 AND string_key = '" + key[0] + "'";
    return queryDb(stepState.pgClient, statement);
  }).then(function (result) {
    // If the result is empty, we need to send another 2-query sequence
    if (result.rows.length == 0) {
       var statement = "SELECT nextval('last_resource_bundle_string_id')";
       return queryDb(stepState.pgClient, statement).then(function(result) {
         var insertIdOffset = parseInt(result.rows[0].nextval);
         var statement = "INSERT INTO localization_strings (resource_bundle_string_id, string_key, string_revision, string_text,modified_date,local_id, bundle_id) VALUES ";
         statement += "  (" + insertIdOffset + ",'" + key[0] + "'," + 0 + ",'" + englishDictionary[key[0]] + "'," + 0 + ",10,20)";
         ms.log(statement);
         return queryDb(stepState.pgClient, statement);
       });
     }
  }).then(function (result) {
    // Continue to the next step
    stepState.pgCB();
    asyncCallback();
  }).fail(function (error) {
    // Handle a database error from any operation in this step...
  });
});

It's still complex, but the complexity is more manageable. Adding a new database operation to every "step" no longer requires a new level of indentation. Also notice that all error handling is done in one place, rather than having to add an if (pgconn.handleError(...)) line every time you perform a database operation.

Update: As requested, here's how you might go about defining the two wrapper functions. I'll assume that you're using kriskowal/q as your promise library:

function connectToDb(dbName) {
  var deferred = Q.defer();
  pg.connect(pgconn.dbserver(dbName), function (err, pgClient, pgCB) {
    if (err) {
      deferred.reject(err)
    } else {
      deferred.resolve({pgClient: pgClient, pgCB: pgCB})
    }
  });
  return deferred.promise;
}

You can use this pattern to create a wrapper around any function that takes a single-use callback.

The queryDb wrapper is even more straightforward because its callback gives you either an error value or a single result value, which means that you can use Q's built-in makeNodeResolver utility method to resolve or reject the deferred:

function queryDb(pgClient, statement) {
  var deferred = Q.defer();
  pgClient.query(statement, deferred.makeNodeResolver());
  return deferred.promise;
}

For more information on promises, check out my book: Async JavaScript, published by PragProg.

The solution to this sort of thing is promises. If you haven't heard of them, I suggest reading up on kriskowal's q.

Now, I don't know if the db.query returns a promise or not. If it doesn't you might be able to find a db-wrapper that does or a different db library. If that is not an option, you may "promisify" the db-library you're using. See Howto use promises with Node, and especially the section "Wrapping a function that takes a Node-style callback".
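For instance, a hand-rolled wrapper might look like this. This is a minimal sketch using native Promises (available in newer Node versions) rather than Q, and fakeQuery is a stand-in since the actual db library isn't specified:

```javascript
// Stand-in for a db.query-style API: last argument is callback(err, result).
// Hypothetical; substitute your real db library's function.
function fakeQuery(sql, cb) {
  process.nextTick(function () {
    cb(null, { rows: [{ password_hash: 'abc123' }] });
  });
}

// Wrap any function whose last argument is a node-style callback
// so that it returns a promise instead.
function promisify(fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    return new Promise(function (resolve, reject) {
      args.push(function (err, value) {
        if (err) reject(err);
        else resolve(value);
      });
      fn.apply(null, args);
    });
  };
}

var queryAsync = promisify(fakeQuery);

queryAsync('SELECT * FROM user WHERE username = ?').then(function (result) {
  console.log(result.rows[0].password_hash); // logs the fake hash
});
```

Once each db call returns a promise, the nesting flattens into a .then() chain.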

Best of luck! :)

The simplest way to combat the async pyramid of hell is to segregate your async callbacks into smaller functions that you can place outside your main loop. Chances are you can at least break some of your callbacks into more maintainable functions that can be used elsewhere in your code base, but the question you're asking is a bit vague and can be solved in a large number of ways.

Also, you should consider what Stuart mentioned in his answer and try to combine some of your queries together. I'm more concerned that you have 20+ nested calls, which would indicate something seriously erroneous in your callback structure, so I'd look at your code first before anything else.

Consider rewriting your code to have less back-and-forth with the database. The rule of thumb I use to estimate an app's performance under heavy load is that every async call will add two seconds to the response (one for the request, and one for the reply).

For example, is there maybe a way you could offload this logic to the database? Or a way to "SELECT nextval('last_resource_bundle_string_id')" at the same time as you "SELECT * FROM localization_strings WHERE local_id = 10 AND string_key = '" + key[0] + "'" (perhaps a stored procedure)?
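For instance, all three round trips in the snippet could in principle collapse into one statement that pulls the nextval and inserts only when the key is missing. A hypothetical sketch, assuming PostgreSQL's INSERT ... SELECT ... WHERE NOT EXISTS fits your schema; column names are taken from the question's snippet:

```javascript
// Single-round-trip version of the select / nextval / insert dance,
// shown as the statement string the node code would send.
var key = 'some_key';    // placeholder for key[0]
var text = 'some text';  // placeholder for englishDictionary[key[0]]
var statement =
  "INSERT INTO localization_strings " +
  "  (resource_bundle_string_id, string_key, string_revision, string_text, modified_date, local_id, bundle_id) " +
  "SELECT nextval('last_resource_bundle_string_id'), '" + key + "', 0, '" + text + "', 0, 10, 20 " +
  "WHERE NOT EXISTS (SELECT 1 FROM localization_strings WHERE local_id = 10 AND string_key = '" + key + "')";
```

One pgClient.query(statement, ...) then replaces the whole nested sequence, so the iteration finishes in a single callback level.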

I break each level of the pyramid of doom into a function and chain them one to the other. I think it is a lot easier to follow. In the example above I'd do it as follows.

ms.async.eachSeries(arrWords, function (key, asyncCallback) {

  var pgCB;
  var pgClient;

  var connect = function () {
    pg.connect(pgconn.dbserver('galaxy'), function (err, _pgClient, _pgCB) {
      pgClient = _pgClient;
      pgCB = _pgCB;
      findKey();
    });
  };

  var findKey = function () {
    statement = "SELECT * FROM localization_strings WHERE local_id = 10 AND string_key = '" + key[0] + "'";
    pgClient.query(statement, function (err, result) {
      if (pgconn.handleError(err, pgCB, pgClient))
        return;
      // if key doesn't exist go ahead and insert it
      if (result.rows.length == 0) {
        getId();
        return;
      }
      pgCB(); 
      asyncCallback(); 
    });

  };

  var getId = function () {
    statement = "SELECT nextval('last_resource_bundle_string_id')";
    pgClient.query(statement, function (err, result) {
      if (pgconn.handleError(err, pgCB, pgClient))
        return;
      insertKey(result); // pass the result along; it isn't in scope in insertKey otherwise
    });
  };

  var insertKey = function (result) {
    var insertIdOffset = parseInt(result.rows[0].nextval);
    statement = "INSERT INTO localization_strings (resource_bundle_string_id, string_key, string_revision, string_text,modified_date,local_id, bundle_id) VALUES ";
    statement += "  (" + insertIdOffset + ",'" + key[0] + "'," + 0 + ",'" + englishDictionary[key[0]] + "'," + 0 + ",10,20)";
    ms.log(statement);
    pgClient.query(statement, function (err, result) {
      if (pgconn.handleError(err, pgCB, pgClient))
        return;
      pgCB();
      asyncCallback();
    });
  };

  connect();

});
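Whichever structure you settle on, note that concatenating key[0] into the SQL invites injection and quoting bugs; node-postgres also accepts a values array with $1-style placeholders. A sketch of that call shape, using a stand-in client object so it runs on its own:

```javascript
// Stand-in for a pg client, just to show the parameterized call shape.
// node-postgres's query(text, values, callback) form fills $1, $2, ...
// from the values array, so no manual quoting is needed.
var fakeClient = {
  query: function (text, values, cb) {
    this.last = { text: text, values: values }; // record the call for inspection
    process.nextTick(function () { cb(null, { rows: [] }); });
  }
};

fakeClient.query(
  'SELECT * FROM localization_strings WHERE local_id = $1 AND string_key = $2',
  [10, 'greeting'], // 'greeting' stands in for key[0]
  function (err, result) {
    // use result.rows as before
  }
);
```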
