C:\Users\Felix\PycharmProjects\pythonProject\venv\Scripts\python.exe "C:\Program Files\JetBrains\PyCharm 2021.2.1\plugins\python\helpers\pydev\pydevconsole.py" --mode=client --port=64936
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['C:\\Users\\Felix\\PycharmProjects\\machine-learning2', 'C:/Users/Felix/PycharmProjects/machine-learning2'])
PyDev console: starting.
Python 3.8.10 (tags/v3.8.10:3d8993a, May  3 2021, 11:48:03) [MSC v.1928 64 bit (AMD64)] on win32
runfile('C:/Users/Felix/PycharmProjects/machine-learning2/main.py', wdir='C:/Users/Felix/PycharmProjects/machine-learning2')
C:\Users\Felix\PycharmProjects\machine-learning2
Cross-validation fold 1/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 2/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 3/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 4/10: lambda value 1e2.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 5/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 6/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 7/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 8/10: lambda value 1e1.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 9/10: lambda value 1e2.0
         Offset             0.0
      std_hydro 6.493662708593406e+29
       std_wind 2.593454102305807e+161
      std_solar 3.7559734772364737e+199
    std_biomass             0.0
    std_nuclear             0.0
   std_biofuels             0.0
Cross-validation fold 10/10: lambda value 1e1.0
         Offset             0.0
      std_hydro            0.44
       std_wind            0.25
      std_solar           -0.16
    std_biomass            0.33
    std_nuclear            0.32
   std_biofuels           -0.22
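
The block above is the per-fold output of a regularized linear regression run inside 10-fold cross-validation: each fold reports the lambda it settled on and the fitted weight for each standardized attribute. (The exploded magnitudes in folds 1 through 9 suggest the solve diverged numerically in those folds; fold 10 shows plausible standardized coefficients.) Below is a minimal sketch of the kind of loop that produces this printout. Only the attribute names, the 1e-style lambda formatting, and the 10-fold setup are taken from the log; X, y, the lambda grid, and the fold-wise selection strategy are illustrative assumptions.

import numpy as np
from sklearn.model_selection import KFold

# Assumes X (N x M) already contains a leading column of ones (the Offset
# term) plus the standardized attributes, and y (N,) is the target.
attribute_names = ["Offset", "std_hydro", "std_wind", "std_solar",
                   "std_biomass", "std_nuclear", "std_biofuels"]
lambdas = np.power(10.0, np.arange(-2, 3, 1.0))  # assumed grid covering 1e1.0 and 1e2.0

def ridge_weights(X, y, lam):
    # Closed-form ridge solution; the offset column is not penalized.
    penalty = lam * np.eye(X.shape[1])
    penalty[0, 0] = 0.0
    return np.linalg.solve(X.T @ X + penalty, X.T @ y)

for k, (train_idx, test_idx) in enumerate(KFold(n_splits=10, shuffle=True).split(X), start=1):
    X_train, y_train = X[train_idx], y[train_idx]
    X_test, y_test = X[test_idx], y[test_idx]
    # Pick the lambda with the lowest error on this fold
    # (a proper inner validation split is omitted here for brevity).
    errors = [np.mean((X_test @ ridge_weights(X_train, y_train, lam) - y_test) ** 2)
              for lam in lambdas]
    best = lambdas[int(np.argmin(errors))]
    w = ridge_weights(X_train, y_train, best)
    print(f"Cross-validation fold {k}/10: lambda value 1e{np.log10(best)}")
    for name, weight in zip(attribute_names, w):
        print(f"{name:>15} {np.round(weight, 2)}")
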
Linear regression without feature selection:
- Training error: 0.2233985382509828
- Test error:     0.23005267968225426
- R^2 train:      0.7765406682799204
- R^2 test:       0.7694048308045904
Regularized linear regression:
- Training error: 0.2235752733396504
- Test error:     0.229970193383863
- R^2 train:      0.7763638850963134
- R^2 test:       0.7694875116147427
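
For reference, the errors and R^2 values above are presumably the mean squared error per observation and the coefficient of determination, as in this sketch:

import numpy as np

def mse(y, y_hat):
    # Mean squared error per observation.
    return np.mean((y - y_hat) ** 2)

def r_squared(y, y_hat):
    # Coefficient of determination R^2.
    return 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)

With a standardized target (unit variance), R^2 is approximately 1 - MSE, which is consistent with the numbers above (e.g. 1 - 0.2301 = 0.7699 against the reported test R^2 of 0.7694).
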
Weights in last fold:
         Offset             0.0
      std_hydro            0.44
       std_wind            0.25
      std_solar           -0.16
    std_biomass            0.33
    std_nuclear            0.32
   std_biofuels           -0.22
	Replicate: 1/1
		Iter	Loss			Rel. loss
C:\Users\Felix\PycharmProjects\pythonProject\venv\lib\site-packages\torch\nn\modules\loss.py:520: UserWarning: Using a target size (torch.Size([3291])) that is different to the input size (torch.Size([3291, 1])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
  return F.mse_loss(input, target, reduction=self.reduction)
		Final loss:
		994	0.98524773	9.679529e-07
	Best loss: 0.9852477312088013
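
The UserWarning in the block above is raised because the network output has shape (N, 1) while the target passed to MSELoss has shape (N,); broadcasting turns the residual into an (N, N) matrix, so the reported loss averages over the wrong pairs. A minimal, assumed fix is to match the target shape before training (the variable name y_train is illustrative):

import torch

y_train = torch.randn(3291)      # stand-in for the actual target tensor
y_train = y_train.unsqueeze(1)   # reshape (3291,) -> (3291, 1) to match the net output
# equivalently: y_train = y_train.reshape(-1, 1), or squeeze the model output instead

The same warning recurs below for sizes 3292 and 3293 (the other outer folds).
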
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0212284	4.7390648e-05
		2000	0.9976872	6.6314087e-06
		3000	0.99380046	2.2791007e-06
		Final loss:
		3507	0.9929806	9.0038907e-07
	Best loss: 0.9929805994033813
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9790843	1.4732246e-05
		2000	0.9710726	5.094532e-06
		3000	0.968206	1.1081138e-06
		Final loss:
		3051	0.9681487	9.850485e-07
	Best loss: 0.9681487083435059
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		709	0.9056353	9.872283e-07
	Best loss: 0.9056352972984314
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9241547	2.708841e-06
		Final loss:
		1043	0.9240902	9.675125e-07
	Best loss: 0.9240902066230774
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		847	0.95595247	9.3526495e-07
	Best loss: 0.9559524655342102
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.002736	2.7461456e-05
		2000	0.9833151	3.9400256e-06
		Final loss:
		2623	0.98185384	9.712987e-07
	Best loss: 0.9818538427352905
	Replicate: 1/1
		Iter	Loss			Rel. loss
C:\Users\Felix\PycharmProjects\pythonProject\venv\lib\site-packages\torch\nn\modules\loss.py:520: UserWarning: Using a target size (torch.Size([3292])) that is different to the input size (torch.Size([3292, 1])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
  return F.mse_loss(input, target, reduction=self.reduction)
		1000	0.9824225	3.8160662e-05
		Final loss:
		1827	0.97609377	9.770305e-07
	Best loss: 0.9760937690734863
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0022136	1.4630143e-05
		Final loss:
		1741	0.99710554	9.564418e-07
	Best loss: 0.9971055388450623
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9810098	3.9492843e-06
		Final loss:
		1865	0.97941804	9.737144e-07
	Best loss: 0.9794180393218994
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.994765	8.148825e-06
		Final loss:
		1325	0.99361044	9.598061e-07
	Best loss: 0.9936104416847229
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		630	1.0226604	9.325417e-07
	Best loss: 1.0226603746414185
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9948235	1.3780383e-06
		Final loss:
		1056	0.9947576	8.987806e-07
	Best loss: 0.9947575926780701
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0445946	1.5976794e-06
		Final loss:
		1064	1.0445077	9.130363e-07
	Best loss: 1.0445077419281006
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0431893	3.3139315e-06
		Final loss:
		1436	1.0422955	9.149742e-07
	Best loss: 1.0422954559326172
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		923	1.0638434	8.9644163e-07
	Best loss: 1.0638433694839478
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9509946	6.894325e-06
		2000	0.94780487	1.2577393e-06
		Final loss:
		2083	0.9477105	9.433987e-07
	Best loss: 0.9477105140686035
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0271865	4.4100393e-06
		Final loss:
		1244	1.0266266	9.2893896e-07
	Best loss: 1.0266265869140625
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0817457	2.5235355e-05
		Final loss:
		1652	1.0767621	9.96397e-07
	Best loss: 1.076762080192566
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		955	1.0629141	8.9722533e-07
	Best loss: 1.0629141330718994
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0561572	2.5960212e-06
		Final loss:
		1274	1.0556123	9.034315e-07
	Best loss: 1.055612325668335
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0456482	1.0260453e-06
		Final loss:
		1025	1.0456184	9.120664e-07
	Best loss: 1.0456184148788452
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		970	0.99069345	9.626322e-07
	Best loss: 0.9906934499740601
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0086472	3.7819793e-06
		2000	1.0063756	1.7768081e-06
		3000	1.0047412	1.3051127e-06
		Final loss:
		3434	1.0041937	9.4969073e-07
	Best loss: 1.004193663597107
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		796	0.92811173	8.9909895e-07
	Best loss: 0.928111732006073
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9500775	1.3802036e-06
		Final loss:
		1054	0.95001423	9.41111e-07
	Best loss: 0.9500142335891724
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		471	1.0261054	9.2941076e-07
	Best loss: 1.0261054039001465
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0379533	1.6882715e-05
		Final loss:
		1936	1.0330124	9.231965e-07
	Best loss: 1.0330123901367188
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.959688	3.0433002e-06
		Final loss:
		1490	0.9588726	9.945777e-07
	Best loss: 0.9588726162910461
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0425172	4.9169207e-06
		Final loss:
		1708	1.0408254	9.162665e-07
	Best loss: 1.040825366973877
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		189	1.0630523	8.971087e-07
	Best loss: 1.0630522966384888
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0895205	1.3567209e-05
		2000	1.0846852	1.5386285e-06
		Final loss:
		2320	1.0842094	9.895529e-07
	Best loss: 1.0842094421386719
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9953064	1.9762251e-06
		Final loss:
		1243	0.99496275	8.985953e-07
	Best loss: 0.994962751865387
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0610036	1.2359059e-06
		Final loss:
		1031	1.0609652	8.9887345e-07
	Best loss: 1.0609651803970337
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0817248	3.3060785e-06
		Final loss:
		1474	1.0806589	9.928041e-07
	Best loss: 1.0806589126586914
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0997015	1.842822e-06
		Final loss:
		1076	1.0995859	9.757151e-07
	Best loss: 1.0995858907699585
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.96752995	3.5114704e-06
		Final loss:
		1495	0.96659315	9.2496913e-07
	Best loss: 0.9665931463241577
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0747453	4.5476436e-06
		Final loss:
		1683	1.0730083	9.998828e-07
	Best loss: 1.0730082988739014
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		855	1.0574577	9.0185495e-07
	Best loss: 1.057457685470581
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0449998	1.9392864e-06
		Final loss:
		1211	1.04467	9.1289445e-07
	Best loss: 1.0446699857711792
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		829	1.0358785	9.2064215e-07
	Best loss: 1.0358785390853882
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		499	1.0208569	9.341892e-07
	Best loss: 1.0208568572998047
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0212907	5.9528966e-06
		Final loss:
		1446	1.0197709	9.35184e-07
	Best loss: 1.019770860671997
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0387529	3.7181482e-05
		2000	1.0253586	2.5577372e-06
		Final loss:
		2209	1.0249497	9.304588e-07
	Best loss: 1.0249496698379517
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.029806	8.681848e-06
		Final loss:
		1623	1.0275888	9.280691e-07
	Best loss: 1.0275888442993164
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.97429967	1.6701018e-05
		2000	0.969189	1.5989846e-06
		Final loss:
		2167	0.96897924	9.842041e-07
	Best loss: 0.9689792394638062
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0676141	6.476211e-06
		Final loss:
		1290	1.0666219	8.941064e-07
	Best loss: 1.0666218996047974
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9705255	1.7810265e-06
		Final loss:
		1130	0.97034	9.82824e-07
	Best loss: 0.9703400135040283
	Replicate: 1/1
		Iter	Loss			Rel. loss
C:\Users\Felix\PycharmProjects\pythonProject\venv\lib\site-packages\torch\nn\modules\loss.py:520: UserWarning: Using a target size (torch.Size([3293])) that is different to the input size (torch.Size([3293, 1])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
  return F.mse_loss(input, target, reduction=self.reduction)
		Final loss:
		415	1.0525274	9.0607944e-07
	Best loss: 1.0525274276733398
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0144153	3.877989e-06
		Final loss:
		1560	1.0131611	9.412851e-07
	Best loss: 1.013161063194275
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0150399	1.4915033e-05
		Final loss:
		1794	1.0096835	9.445271e-07
	Best loss: 1.0096834897994995
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9975023	1.2010388e-05
		Final loss:
		1925	0.9927323	9.606551e-07
	Best loss: 0.9927322864532471
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9673866	1.6019407e-05
		2000	0.96263164	1.6098767e-06
		Final loss:
		2420	0.9620764	9.293117e-07
	Best loss: 0.9620764255523682
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.063058	4.9340515e-06
		2000	1.0598615	1.7996173e-06
		Final loss:
		2504	1.0591164	9.0044256e-07
	Best loss: 1.0591163635253906
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		969	0.9920923	9.011952e-07
	Best loss: 0.9920923113822937
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0209599	1.2843801e-06
		Final loss:
		1058	1.0208912	9.3415775e-07
	Best loss: 1.0208911895751953
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.009589	4.13268e-06
		Final loss:
		1251	1.0090244	9.4514405e-07
	Best loss: 1.0090243816375732
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9344	2.226194e-05
		Final loss:
		1922	0.92891306	9.624893e-07
	Best loss: 0.9289130568504333
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0314684	9.823559e-06
		Final loss:
		1551	1.0291407	9.266696e-07
	Best loss: 1.0291407108306885
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.050801	1.4747973e-06
		Final loss:
		1051	1.0507354	9.076248e-07
	Best loss: 1.050735354423523
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		771	1.03955	9.173907e-07
	Best loss: 1.0395499467849731
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0589944	5.178119e-06
		Final loss:
		1444	1.0577877	9.015736e-07
	Best loss: 1.0577876567840576
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0620677	1.6836368e-06
		Final loss:
		1278	1.0616021	8.983342e-07
	Best loss: 1.0616021156311035
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		570	0.9753868	9.1663003e-07
	Best loss: 0.9753867983818054
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0271652	2.4371825e-06
		Final loss:
		1280	1.0266869	9.288844e-07
	Best loss: 1.0266869068145752
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.92609787	2.2976372e-05
		Final loss:
		1999	0.9214084	9.703284e-07
	Best loss: 0.9214084148406982
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0081984	1.5607426e-05
		Final loss:
		1828	1.0053587	9.485902e-07
	Best loss: 1.0053586959838867
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0467818	4.327486e-06
		Final loss:
		1413	1.045934	9.117913e-07
	Best loss: 1.0459339618682861
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		976	0.98493606	8.472269e-07
	Best loss: 0.9849360585212708
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		790	1.0035611	9.502893e-07
	Best loss: 1.0035611391067505
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		513	1.004001	9.4987297e-07
	Best loss: 1.0040010213851929
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9855558	1.6329088e-06
		Final loss:
		1398	0.98505986	9.681376e-07
	Best loss: 0.9850598573684692
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9984879	5.0740414e-06
		Final loss:
		1637	0.9968594	8.9688564e-07
	Best loss: 0.9968593716621399
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0048	1.0914745e-05
		Final loss:
		1951	1.0010853	9.5263954e-07
	Best loss: 1.0010852813720703
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9955151	3.5325045e-06
		Final loss:
		1138	0.9952203	8.9836277e-07
	Best loss: 0.9952203035354614
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9933245	9.840757e-06
		2000	0.98969436	1.2045047e-06
		Final loss:
		2059	0.9896258	9.0344133e-07
	Best loss: 0.9896258115768433
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9809536	6.8660524e-06
		Final loss:
		1413	0.9797938	9.733409e-07
	Best loss: 0.9797937870025635
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0231436	7.689783e-06
		Final loss:
		1962	1.0199977	9.34976e-07
	Best loss: 1.0199977159500122
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.012166	2.9207698e-05
		2000	1.0023984	2.1406286e-06
		Final loss:
		2281	1.0019377	9.51829e-07
	Best loss: 1.001937747001648
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.97911423	2.9220437e-06
		Final loss:
		1860	0.9756964	9.163392e-07
	Best loss: 0.9756963849067688
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		966	0.8937203	9.3369727e-07
	Best loss: 0.8937203288078308
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9320393	7.2903363e-06
		Final loss:
		1880	0.9293903	9.619949e-07
	Best loss: 0.9293903112411499
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9174606	7.113378e-05
		2000	0.90512586	3.6877168e-06
		3000	0.90291095	1.5183167e-06
		Final loss:
		3394	0.9024165	9.907495e-07
	Best loss: 0.9024165272712708
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		718	0.8903369	9.3724555e-07
	Best loss: 0.8903368711471558
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0024183	1.0702839e-05
		2000	0.99363106	3.7791476e-06
		3000	0.990593	2.406821e-06
		4000	0.9888974	1.1452017e-06
		Final loss:
		4131	0.98875123	9.0424044e-07
	Best loss: 0.988751232624054
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9153418	4.890075e-05
		Final loss:
		1672	0.90187746	9.913417e-07
	Best loss: 0.9018774628639221
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9050002	1.2184224e-05
		Final loss:
		1222	0.9040578	9.889509e-07
	Best loss: 0.9040578007698059
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.95098484	3.697915e-06
		Final loss:
		1398	0.9502413	9.408861e-07
	Best loss: 0.9502413272857666
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		874	0.86648226	9.630483e-07
	Best loss: 0.8664822578430176
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9189439	5.124081e-06
		Final loss:
		1771	0.9168922	9.751079e-07
	Best loss: 0.9168921709060669
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9529976	1.250886e-06
		Final loss:
		1002	0.9529957	9.381667e-07
	Best loss: 0.9529957175254822
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0276155	2.1924601e-05
		2000	1.0199648	1.8700107e-06
		Final loss:
		2213	1.019659	9.352866e-07
	Best loss: 1.0196590423583984
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		514	1.0479833	9.100083e-07
	Best loss: 1.0479832887649536
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0320913	1.1550253e-06
		Final loss:
		1023	1.0320637	9.240451e-07
	Best loss: 1.0320637226104736
	Replicate: 1/1
		Iter	Loss			Rel. loss
		Final loss:
		540	0.98741287	9.658304e-07
	Best loss: 0.9874128699302673
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9617911	4.2141155e-06
		Final loss:
		1710	0.96022373	9.311047e-07
	Best loss: 0.9602237343788147
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.98111516	1.0631476e-05
		Final loss:
		1539	0.9783219	9.748053e-07
	Best loss: 0.97832190990448
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0804278	7.7669996e-05
		2000	1.0482185	4.5490037e-06
		Final loss:
		2718	1.0468409	9.1100134e-07
	Best loss: 1.0468409061431885
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	0.9759562	3.5422256e-06
		Final loss:
		1369	0.9752281	9.778978e-07
	Best loss: 0.9752280712127686
	Replicate: 1/1
		Iter	Loss			Rel. loss
		1000	1.0555327	4.743355e-06
		Final loss:
		1475	1.0542041	9.0463834e-07
	Best loss: 1.0542041063308716
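
The repeated Replicate / Iter / Final loss / Best loss blocks above match the output of a restart-and-converge training helper (the format resembles train_neural_net from the DTU 02450 course toolbox, though that is an assumption). Below is a sketch of such a loop, assuming an Adam optimizer, a progress report every 1000 iterations, and a relative-loss tolerance of about 1e-6, which matches the Rel. loss values printed at termination.

import torch

def train_net(model_fn, loss_fn, X, y, n_replicates=1, max_iter=10000, tol=1e-6):
    # Sketch of a training helper matching the log format above: restart
    # n_replicates times, report the loss every 1000 iterations, stop once
    # the relative loss change falls below tol, and keep the best run.
    best_loss, best_net = float("inf"), None
    for r in range(n_replicates):
        print(f"\tReplicate: {r + 1}/{n_replicates}")
        print("\t\tIter\tLoss\t\t\tRel. loss")
        net = model_fn()                     # fresh random init per replicate
        optimizer = torch.optim.Adam(net.parameters())
        old_loss = None
        for i in range(1, max_iter + 1):
            loss = loss_fn(net(X), y)
            rel = abs(loss.item() - old_loss) / old_loss if old_loss else 1.0
            old_loss = loss.item()
            if rel < tol:                    # converged: report final iteration and stop
                print("\t\tFinal loss:")
                print(f"\t\t{i}\t{loss.item():.8g}\t{rel:e}")
                break
            if i % 1000 == 0:
                print(f"\t\t{i}\t{loss.item():.8g}\t{rel:e}")
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        if loss.item() < best_loss:
            best_loss, best_net = loss.item(), net
    print(f"\tBest loss: {best_loss}")
    return best_net, best_loss

With n_replicates=1 each call prints exactly one Replicate: 1/1 block, so the repetition above presumably corresponds to one call per outer cross-validation fold (and model configuration).
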