R and nls: singular gradient matrix at initial parameter estimates

Date: 2015-12-10 12:01:16

Tags: r, nls

I am trying to use nls to estimate the parameters of a nonlinear model.

I first use nls2 with a random search to find good initial parameters, and then I use nls to refine the estimates with the Gauss-Newton method.
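
For reference, the random-search step looks roughly like the sketch below (a minimal sketch assuming the nls2 package and the formula and data defined further down; start_bounds and its bound values are illustrative placeholders, not the values actually used):

library(nls2)

# Hypothetical two-row data frame of lower/upper bounds per parameter;
# with algorithm = "random-search", nls2 samples start values between the rows.
start_bounds <- data.frame(A1 = c(0, 1), alpha1 = c(0, 1),
                           B1 = c(0, 1), beta1  = c(0, 1),
                           A2 = c(0, 1), alpha2 = c(0, 1),
                           B2 = c(0, 1), beta2  = c(0, 1))

fit0 <- nls2(formula, data = df, start = start_bounds,
             algorithm = "random-search",
             control = nls.control(maxiter = 500))

# keep the best candidate as the starting point for nls
initial_par <- as.data.frame(t(coef(fit0)))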

The problem is that I always get a "singular gradient matrix at initial parameter estimates" error.

I am not sure I understand, because the input does not look like it should produce a singular gradient matrix.

Moreover, even if the model I am fitting is not well suited to the data, nls should still find a way to improve the parameter estimates. Shouldn't it?

Question: is there a way to improve the parameter estimation?

I have already tried nls.lm (from the minpack.lm package), but I ran into the same problem.
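
For completeness, that attempt was along these lines (a sketch assuming the minpack.lm package, with the formula, df and initial_par defined below; nlsLM is minpack.lm's formula interface, while the lower-level nls.lm takes a residual function instead):

library(minpack.lm)

# Levenberg-Marquardt instead of Gauss-Newton, same formula and start values
fit_lm <- nlsLM(formula, data = df, start = as.list(initial_par))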

Here is a reproducible example.

The data, df:

structure(list(x1 = c(0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L), x2 = c(1L, 2L, 3L, 4L, 
5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 17L, 18L, 
19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 30L, 31L, 
32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 43L, 44L, 
45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 55L, 56L, 57L, 
58L, 59L, 60L, 61L, 62L, 63L, 64L, 65L, 66L, 67L, 0L, 1L, 2L, 
3L, 4L, 5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 
17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 
30L, 31L, 32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 
43L, 44L, 45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 55L, 
56L, 57L, 58L, 59L, 60L, 61L, 62L, 63L, 64L, 65L, 66L, 0L, 1L, 
2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 
16L, 17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 
29L, 30L, 31L, 32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 
42L, 43L, 44L, 45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 
55L, 56L, 57L, 58L, 59L, 60L, 61L, 62L, 63L, 64L, 65L, 0L, 1L, 
2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 
16L, 17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 
29L, 30L, 31L, 32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 
42L, 43L, 44L, 45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 
55L, 56L, 57L, 58L, 59L, 60L, 61L, 62L, 63L, 64L, 0L, 1L, 2L, 
3L, 4L, 5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 
17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 
30L, 31L, 32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 
43L, 44L, 45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 55L, 
56L, 57L, 58L, 59L, 60L, 61L, 62L, 63L, 0L, 1L, 2L, 3L, 4L, 5L, 
6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 17L, 18L, 
19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 30L, 31L, 
32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 43L, 44L, 
45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 53L, 54L, 55L, 56L, 57L, 
58L, 59L, 60L, 61L, 62L, 0L, 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 
9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 17L, 18L, 19L, 20L, 21L, 
22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 30L, 31L, 32L, 33L, 34L, 
35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 43L, 44L, 45L, 46L, 47L, 
48L, 49L, 50L, 51L, 52L, 53L, 54L, 55L, 56L, 57L, 58L, 59L, 60L, 
61L, 0L, 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L, 10L, 11L, 12L, 13L, 
14L, 15L, 16L, 17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 
27L, 28L, 29L, 30L, 31L, 32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 
40L, 41L, 42L, 43L, 44L, 45L), y = c(0.0689464583349188, 0.0358227182166929, 
0.0187034836294036, 0.0227081421239796, 0.0146603483536504, 0.00562771204350896, 
0.00411351161052011, 0.00356917888321555, 0.0028017552960605, 
0.0024750328652541, 0.00243175013170564, 0.00242654283706898, 
0.00235224917236107, 0.00176144220485858, 0.00138071934398105, 
0.000696375069179013, 0.00106282865382483, 0.00114735219137874, 
0.00277256441625284, 0.00214359572321392, 0.00144935953386591, 
0.00249732559162499, 0.00225859018399108, 0.00201642941663214, 
0.00232438586834105, 0.0016083751355862, 0.00143118376291818, 
0.00158323933266031, 0.00157585431454131, 0.00169206800399143, 
0.00158514119474578, 0.00134506293557103, 0.00119442163345335, 
0.00101284069499962, 0.0012621113004254, 0.00128964367655383, 
0.00102819258807122, 0.00125345601171754, 0.00116155619985178, 
0.00142466624262548, 0.00141075318725309, 0.00106556656123991, 
0.0010976347045814, 0.0012442089226047, 0.0010627617251863, 0.00125322168410487, 
0.00112108560656369, 0.0012459199320756, 0.00135773322693401, 
0.0013997982284804, 0.00155012485145915, 0.00151108062240688, 
0.00149570655260348, 0.00152598641103596, 0.00108261570337346, 
0.000992225418429453, 0.000769588971038765, 0.000700496873143604, 
0.000688378351958078, 0.000595007407260441, 0.000557615594951187, 
0.00040476923690092, 0.000492276455560289, 0.000447248723966691, 
0.000388694992851599, 0.000346087542525691, 0.000189803623801549, 
0.0709302325562937, 0.0424623423412875, 0.019085896698975, 0.0190650552541205, 
0.014276898897581, 0.00593407290200902, 0.00445528598343583, 
0.00371231334350143, 0.00253909496678967, 0.00263487912423124, 
0.00248012072619926, 0.00263786771266913, 0.00219351150766708, 
0.00179271674850348, 0.00139646119589996, 0.000911560061336614, 
0.000989537441246412, 0.001046390000492, 0.00223993432619926, 
0.00164189356162362, 0.00106041866437064, 0.00194151698794588, 
0.0014213192200082, 0.00165239495268553, 0.00196583929282493, 
0.00120501090643706, 0.001141403899631, 0.00122398595424354, 
0.00124538223829438, 0.00123370121853218, 0.00136883147552275, 
0.00110907318146781, 0.000965843164247642, 0.000859986264862649, 
0.00104695561918819, 0.00103985460139401, 0.000455832014104141, 
0.000704296760639607, 0.000870145383845838, 0.000919870911357114, 
0.00101396309667897, 0.000781894087412874, 0.000909712365723658, 
0.000889897365477655, 0.000933063039278393, 0.000779395399425994, 
0.000789546295038951, 0.000773432990897909, 0.00125614787798278, 
0.00123172652693727, 0.00078936677195572, 0.000952107503075031, 
0.00105449131480115, 0.00123128091742517, 0.000889501370397704, 
0.00085648642099221, 0.000830097733497335, 0.000653482256334563, 
0.000521696831160312, 0.000612702433456335, 0.000513576588109881, 
0.000475289330709307, 0.00041141913800738, 0.000328157997211972, 
0.00031336264403444, 0.000328784093808938, 0.000237448446412464, 
0.0520691145678866, 0.0281929482152033, 0.0219024230330532, 0.0141074098760277, 
0.00691341703402584, 0.00445785262213699, 0.0034569415664917, 
0.00234406584844369, 0.00257369504707459, 0.00234047371531346, 
0.00227286083862502, 0.00248544382019894, 0.00180810413760828, 
0.00138986347039715, 0.000911936124008956, 0.000932783218782117, 
0.00108887529088974, 0.0017855660833578, 0.00159768589505946, 
0.00124091041330201, 0.00203036436876009, 0.00154489107876964, 
0.00111687975012847, 0.00163256939968433, 0.00143626193198502, 
0.000996683818914256, 0.0010781399542101, 0.00122575793431581, 
0.00115671467616723, 0.001069532453476, 0.0010106869893371, 0.000978618104445015, 
0.000894478048836441, 0.000842874700392747, 0.000819009288742475, 
0.000843003919670386, 0.000964158733115548, 0.000877802228013507, 
0.00087592051873807, 0.000935810596369843, 0.000879047729316546, 
0.000829181439950081, 0.0010295792954412, 0.000765620227389517, 
0.00102511256239906, 0.000823109180461753, 0.00111669534392894, 
0.000802757620485245, 0.00103231207284173, 0.000884354083467919, 
0.00109278942886507, 0.000969283099489796, 0.000827480664091176, 
0.000798564447676552, 0.000909248326695786, 0.000682209033640434, 
0.000780593294853913, 0.000485172195712818, 0.000467514093470122, 
0.000295219649739392, 0.000460636351123183, 0.00045060371687344, 
0.000492590160218764, 0.000402536549331963, 0.000271941766535751, 
0.000171012123770371, 0.0267385565244063, 0.0275426278720772, 
0.0154589149018475, 0.00729065000152096, 0.00513675524527996, 
0.00378848397112206, 0.00305965140790087, 0.00240428827949139, 
0.00233604733730811, 0.00199601458903693, 0.00198302547453915, 
0.00137121122011316, 0.00126241982975401, 0.0012413298189045, 
0.00103044327584109, 0.00106759120581615, 0.00190957422380402, 
0.00124400301656831, 0.000989035353673623, 0.00160702520431547, 
0.0011515826661394, 0.00153203681379408, 0.00134897491229138, 
0.000916492937174261, 0.00072393419977287, 0.00115124473393361, 
0.00104241370079698, 0.000953324905193568, 0.00121656899373365, 
0.000891420608484922, 0.000671666092758208, 0.000659860761797571, 
0.000586145968952161, 0.00072735268499929, 0.000658407622538582, 
0.000498831767252743, 0.000658345030520574, 0.000542106922897528, 
0.000874560054044737, 0.000543320226217274, 0.000751139509440084, 
0.000668632963233356, 0.000656903021131188, 0.000574965903652329, 
0.0006661524076778, 0.000605171890653201, 0.000527045917239561, 
0.000985791370586684, 0.000899420142057553, 0.000933015548254953, 
0.00082137283567561, 0.000870124781995904, 0.000498046123582973, 
0.000540181050881142, 0.000596948101336416, 0.000405622486362069, 
0.000631594016548032, 0.000468749313033603, 0.000389576698910993, 
0.000335624642574679, 0.000286763668856847, 0.000439039581432135, 
0.000244767908276044, 0.000303911794528604, 0.000160988671898765, 
0.0365772382134747, 0.0255898183301035, 0.010327803963121, 0.00714710822108354, 
0.00506253612461807, 0.00447056668291465, 0.00322822676102386, 
0.00328154620569948, 0.0028470908747756, 0.00253477302081723, 
0.00187837758253778, 0.00116416512964702, 0.00119557763663167, 
0.000993575112051645, 0.00136274483135782, 0.00204131052512691, 
0.00157953945941769, 0.00116523253183218, 0.00190793844827791, 
0.00144595416523011, 0.00157423646879793, 0.00126996001866537, 
0.00115283860342634, 0.00116894693507543, 0.000930041619012519, 
0.00106545753272384, 0.00123507493015348, 0.00130865599847824, 
0.000940647984853709, 0.000836521897923032, 0.000778436697656724, 
0.00100773629284415, 0.000956581999215341, 0.000808036977042788, 
0.000597930101173421, 0.000776453419209873, 0.000630241947142534, 
0.000649832426616575, 0.000782188275296327, 0.00102823806308181, 
0.000830656989407107, 0.00051915559901561, 0.000537114715917872, 
0.000872430107712244, 0.000549284113632851, 0.000738257038745497, 
0.00097442578198376, 0.000879724260815807, 0.000884543540237537, 
0.00100038027474944, 0.00103543285342337, 0.000875585441608313, 
0.000829083410412184, 0.000760316116414823, 0.000712211369823927, 
0.000386744815307978, 0.000428331410721292, 0.000397681982571065, 
0.000213938551710199, 0.000370800615243779, 0.000281234314553042, 
0.000267359921177464, 0.000358376119030352, 0.000337361541022196, 
0.0310029062887812, 0.0154963087949333, 0.00959302943445506, 
0.00645674376405936, 0.00525321947702945, 0.00386084394749159, 
0.00374364242039947, 0.00351047952579374, 0.00298556939927835, 
0.00199158625919048, 0.00206559575086432, 0.00169077836254661, 
0.00139156751815451, 0.00170363478493893, 0.00250481301085496, 
0.00182474837251083, 0.00116804333227652, 0.00155778636185214, 
0.00183778204100427, 0.00135012918459471, 0.00166904872503284, 
0.00120137403943415, 0.00108307957787943, 0.00146041465872549, 
0.0014437889563235, 0.000975926161359965, 0.00102580511345623, 
0.00112145083941, 0.000921884915530595, 0.00082253191796126, 
0.000634876416504371, 0.00108601324863747, 0.000830573067167897, 
0.000965052460105379, 0.000922667052402736, 0.000863193817654785, 
0.000982111173513293, 0.000763009170856168, 0.000921755812461313, 
0.000771609983091022, 0.000669047474976222, 0.000773869648383834, 
0.00072022523061129, 0.000742426347056781, 0.000718728249316847, 
0.000761437280522971, 0.000833112611531319, 0.000794451658438637, 
0.000907360341651947, 0.00112083735676435, 0.00102996529205731, 
0.000651843453054939, 0.000640968179416338, 0.000549646466476441, 
0.000778958256714525, 0.000627413038784969, 0.000523658918731223, 
0.000418571973368359, 0.000643352520494588, 0.000351378727146459, 
0.000504093577607682, 0.000333827596358531, 0.000339505558071773, 
0.0181836504450303, 0.0135527124187004, 0.00780738765319868, 
0.00643260738080874, 0.00476881905655232, 0.00406986745617877, 
0.00400325917456592, 0.00277499160186111, 0.00198311377238581, 
0.00241837807740304, 0.00141018451525995, 0.00166798657140732, 
0.0013970042073337, 0.00237332662413329, 0.00146721126831566, 
0.000990562316636778, 0.00186106889002752, 0.00186322276224556, 
0.00140391140302307, 0.00139027556176293, 0.00125730361478641, 
0.00127044200804939, 0.00126655503830484, 0.00133956330669488, 
0.00128219844136096, 0.00109531452608613, 0.00112195611926977, 
0.00101411381866565, 0.00104786051750783, 0.000798711632769435, 
0.000852432172756047, 0.000852720107765923, 0.00110385307389073, 
0.00081385514739304, 0.00102898862672826, 0.000710330768658628, 
0.000803425598538879, 0.000723455383750816, 0.00075034248654992, 
0.000864917906994041, 0.000799733114881449, 0.000608518601191706, 
0.000855476747683942, 0.000988548021123443, 0.00104800683206201, 
0.000997051779707941, 0.000796235203259423, 0.000910577791459715, 
0.000869997383535945, 0.000557402535474327, 0.000757813148434336, 
0.000480807445269952, 0.000553425518375578, 0.000633029237291637, 
0.00050222863978579, 0.000390945889771328, 0.000430333228928208, 
0.000425167676834459, 0.000239604519722651, 0.000357021364759551, 
0.000292330910803864, 0.000288851701197491, 0.0198837196044917, 
0.0142208140311702, 0.00733039271103269, 0.00609158853724431, 
0.00487605866828399, 0.00382636157210858, 0.00411545257392807, 
0.00235906433257981, 0.00228491326937568, 0.00109255715480326, 
0.00158036861847788, 0.00122011020381908, 0.00223761733564904, 
0.00173284341769128, 0.00117538923471357, 0.00219622963095698, 
0.00214263916211795, 0.0013198229549172, 0.00172951959530242, 
0.00128074705482347, 0.00124062569884766, 0.00144218669111025, 
0.00148407512819099, 0.00100716026446858, 0.0010842890711437, 
0.000800686408079248, 0.000890454658065465, 0.000887152794471706, 
0.00105780722647994, 0.000874948318354744, 0.000569126715186268, 
0.000924642167943982, 0.000857013884141074, 0.000823122890591976, 
0.00073038777177409, 0.000522615873628494, 0.00070936497950782, 
0.000823074755104667, 0.000720588701733105, 0.000722724038337836, 
0.00063458965098969, 0.000620049346639466, 0.000842327487089008, 
0.000617708212493797, 0.000783953750160813, 0.00112567150392384
)), .Names = c("x1", "x2", "y"), class = c("tbl_df", "data.frame"
), row.names = c(NA, -500L))

The initial parameters, initial_par:

structure(list(A1 = 0.0529486559121727, alpha1 = 0.00888818269595504, 
    B1 = 0.250994319084551, beta1 = 0.471984946168959, A2 = 0.281956987357551, 
    alpha2 = 0.325086771510541, B2 = 0.0562204262765557, beta2 = 0.725645614322275), class = "data.frame", row.names = c(NA, 
-1L), .Names = c("A1", "alpha1", "B1", "beta1", "A2", "alpha2", 
"B2", "beta2"))

The formula:

formula = y ~    
  (A1*exp(-alpha1*x1) + B1*exp(-beta1*x1)) *  
  (A2*exp(-alpha2*x2) + B2*exp(-beta2*x2)) 

The nls call and the error message:

final = nls(formula,
             data=df, 
             start = as.list(as.vector(initial_par)))


Error in nlsModel(formula, mf, start, wts) : 
  singular gradient matrix at initial parameter estimates

1 Answer:

Answer 0 (score: 0)

The problem is that there is no one-to-one relationship between the model and the parameters. To see this, write A1 = exp(a1 + d), A2 = exp(a2 - d), B1 = exp(b1 + d), B2 = exp(b2 - d), in which case we have:

y ~ exp(-alpha1 * x1 + a1 + d) * exp(-alpha2 * x2 + a2 - d) +
    exp(-alpha1 * x1 + a1 + d) * exp(-beta2 * x2 + b2 - d) +
    exp(-beta1 * x1 + b1 + d) * exp(-alpha2 * x2 + a2 - d) +
    exp(-beta1 * x1 + b1 + d) * exp(-beta2 * x2 + b2 - d)

but d cancels in each of the four terms and hence cancels from the RHS entirely. That is, the RHS is the same for any value of d, so the model is over-parameterized, and that is what produces the singular gradient.
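
The d in this reparameterization is just a multiplicative scale exp(d): scaling the x1 amplitudes up and the x2 amplitudes down by the same factor leaves every fitted value unchanged. A quick numerical check (a sketch; pred is a helper defined here, and d = 3 is an arbitrary scale):

# model prediction for a one-row data frame of parameters p
pred <- function(p, x1, x2) {
  (p$A1 * exp(-p$alpha1 * x1) + p$B1 * exp(-p$beta1 * x1)) *
    (p$A2 * exp(-p$alpha2 * x2) + p$B2 * exp(-p$beta2 * x2))
}

d <- 3
p2 <- within(initial_par, {
  A1 <- A1 * d; B1 <- B1 * d   # scale the x1 amplitudes up ...
  A2 <- A2 / d; B2 <- B2 / d   # ... and the x2 amplitudes down
})
all.equal(pred(initial_par, df$x1, df$x2), pred(p2, df$x1, df$x2))  # TRUE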

Fix one of A1, A2, B1, B2 and then you should be able to get a solution; here A1 is set to 1 in the calling environment (where nls will find it, since it appears in the formula but not in the start list):

A1 <- 1
nls(formula, df, start = initial_par[-1])

which gives:

Nonlinear regression model
  model: y ~ (A1 * exp(-alpha1 * x1) + B1 * exp(-beta1 * x1)) * (A2 *     exp(-alpha2 * x2) + B2 * exp(-beta2 * x2))
   data: df
 alpha1      B1   beta1      A2  alpha2      B2   beta2 
0.11902 1.21030 0.79076 0.04604 0.51697 0.00183 0.02317 
 residual sum-of-squares: 0.000685

Number of iterations to convergence: 11 
Achieved convergence tolerance: 6.686e-06
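
Note that fixing A1 = 1 only pins down the normalization: the identifiable quantities are the four rates and the amplitude products A1*A2, A1*B2, B1*A2, B1*B2, so a different normalization changes the coefficient table but not the fitted surface. A sketch of extracting those products (assuming the fit above is saved as fit):

fit <- nls(formula, df, start = initial_par[-1])
co <- coef(fit)

# amplitude products, invariant to the choice of normalization (A1 == 1 here)
c(A1A2 = co[["A2"]],              A1B2 = co[["B2"]],
  B1A2 = co[["B1"]] * co[["A2"]], B1B2 = co[["B1"]] * co[["B2"]])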