# Resilink Project
Welcome to the Resilink project repository! This project focuses on **resilient structural designs** to mitigate the impact of natural disasters, especially earthquakes. Below is a breakdown of the project structure and the purpose of each directory.
## Project Structure
```plaintext
RESILINK/
├── data/
├── docs/
├── env/
├── models/
├── references/
├── reports/
└── src/
```
### 1. `/data/`
This folder contains raw and processed data used in the project. It includes:
- **Raw Data**: Initial datasets collected from various sources related to the project (e.g., simulation results, earthquake data).
- **Processed Data**: Data that has been cleaned or transformed for use in models and analysis.
### 2. `/docs/`
Documentation related to the project, including:
- **Technical Documentation**: Information on methodologies, system design, and project goals.
- **User Guides**: Instructions on how to run the code, set up the environment, and contribute to the repository.
- **Papers/Reports**: Any research papers, technical reports, or white papers produced during the project.
### 3. `/env/`
Environment configuration files such as:
- **Virtual Environment Files**: Files for setting up the Python or other environments necessary for the project (e.g., `requirements.txt` or `environment.yml` for Conda).
- **Configuration Files**: Other environment-related files such as `.env` files or Docker configurations.
### 4. `/models/`
This directory contains:
- **Trained Models**: Machine learning or simulation models used for the project.
- **Model Specifications**: Documentation and files related to the architecture and training process of the models used.
- **Pre-trained Models**: Any pre-trained models used as part of the research.
### 5. `/references/`
A collection of references and research materials, including:
- **Citations**: PDFs or links to papers, books, and articles referenced during the project.
- **Standards/Regulations**: Relevant building codes, seismic standards, or other legal regulations used in the research and development phase.
### 6. `/reports/`
This directory holds reports and outputs generated by the project, including:
- **Progress Reports**: Regular updates on the project milestones.
- **Final Reports**: Consolidated results and findings of the project.
- **Visualizations/Charts**: Any figures or charts generated during the analysis.
### 7. `/src/`
The main source code for the project, organized into subfolders:
- **Algorithms**: Code for running simulations, machine learning models, or analytical algorithms developed as part of the project.
- **Utilities**: Helper scripts and functions to process data or assist with tasks.
- **Scripts**: Specific scripts to run certain experiments or analyses.
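The layout above implies a typical workflow: utilities in `src/` read files from `data/` and feed results into `reports/`. As a purely illustrative sketch of such a helper (the function name and column names are hypothetical, not taken from the actual codebase), using only the Python standard library:

```python
import csv
from pathlib import Path

def load_simulation_results(path):
    """Read a simulation-results CSV (e.g. from data/) into a list of
    dicts with numeric values.

    NOTE: this is an illustrative utility, not project code; the column
    layout is assumed to mirror the sample tables in this repository
    (tw1, tw2, ...) and should be adjusted for your own data files.
    """
    rows = []
    with Path(path).open(newline="") as fh:
        # DictReader consumes the header row and maps each data row to it.
        for row in csv.DictReader(fh):
            rows.append({key: float(value) for key, value in row.items()})
    return rows
```

A utility like this would let analysis scripts in `src/` stay free of parsing boilerplate.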
## How to Get Started
### Requirements (PENDING)
To set up the project environment:
1. Clone the repository:
```bash
git clone https://github.com/yourusername/Resilink.git
```
2. Navigate to the project directory:
```bash
cd Resilink
```
3. Install the required libraries from the `requirements.txt` or `environment.yml` file:
```bash
pip install -r requirements.txt
```
or
```bash
conda env create -f environment.yml
```
### Running the Project
- To run the simulations or machine learning models, navigate to the `src/` directory and run the respective scripts.
- For data analysis or reporting, refer to the `reports/` folder where the results are stored or the `src/` folder where analysis code is located.
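The model-comparison tables in this repository share a simple CSV layout: one row per output/model pair with its cross-validated error. As a minimal sketch of ranking them with the standard library (the excerpt below is a small hypothetical sample in that layout, and the helper name is ours, not the project's):

```python
import csv
import io

# Hypothetical excerpt in the layout of the model-comparison tables
# (output, model, cross-validated RMSE); values are illustrative.
RESULTS = """output,model,cv_rmse
exymax_tw1,SVR,0.00434
exymax_tw1,FlexibleMLP,0.00594
eyymax_tf,GradientBoosting,0.00610
eyymax_tf,SVR,0.00626
"""

def best_model_per_output(text):
    """Return {output: (model, cv_rmse)} keeping the lowest CV-RMSE per output."""
    best = {}
    for row in csv.DictReader(io.StringIO(text)):
        rmse = float(row["cv_rmse"])
        if row["output"] not in best or rmse < best[row["output"]][1]:
            best[row["output"]] = (row["model"], rmse)
    return best
```

On this sample, `best_model_per_output(RESULTS)` would pick SVR for `exymax_tw1` and GradientBoosting for `eyymax_tf`.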
## Contribution Guidelines
If you'd like to contribute to the Resilink project, please follow these steps:
1. Fork the repository.
2. Create a new branch (`git checkout -b feature/new-feature`).
3. Make your changes.
4. Commit your changes (`git commit -am 'Add new feature'`).
5. Push to the branch (`git push origin feature/new-feature`).
6. Create a pull request.
## License
To be defined.
## Contact
For any questions or issues, please contact:
- **Lead Developer**: Joaquín Irazábal González ([jirazabal@cimne.upc.edu](mailto:jirazabal@cimne.upc.edu))
- **Project Website**: [https://www.cimne.com/sgp/rtd/Project.aspx?id=10073](https://www.cimne.com/sgp/rtd/Project.aspx?id=10073)
tw1,tw2,Exymax_tw1,Exymax_tw2,Eyymax_tf,TFMmax_tw1,TFMmax_tw2,TFMmax_frame
15.22,18.46,0.0393,0.0445,0.1191,73.56,63.43,88.89
12.74,12.61,0.0401,0.0783,0.1249,73,112.4,93.35
18.03,11.3,0.0228,0.0993,0.1671,35.52,145.99,115.54
21.19,19.38,0.0201,0.0506,0.1515,40.39,70.26,102.52
9.78,15.96,0.071,0.0434,0.1168,143.78,74.4,80.59
18.64,13.8,0.0244,0.0815,0.1547,36.51,106.32,107.92
13.46,20.99,0.0536,0.0294,0.1017,105.5,49.82,83.51
8.58,8.88,0.0633,0.1043,0.1303,123.13,165.69,86.98
tw1,tw2,Exymax_tw1,Exymax_tw2,Eyymax_tf,TFMmax_tw1,TFMmax_tw2,TFMmax_frame
15.22,18.46,0.051,0.054,0.145,100.96,83.47,99.26
12.74,12.61,0.0625,0.0941,0.1555,93.41,143.03,105.15
18.03,11.3,0.0296,0.1281,0.2321,41.79,188.88,133.53
21.19,19.38,0.0349,0.0676,0.1833,50.91,88.9,118.74
9.78,15.96,0.0989,0.0507,0.1338,180.23,82.77,91.58
18.64,13.8,0.0303,0.1027,0.1914,44.33,142.86,125.65
13.46,20.99,0.0686,0.0333,0.1173,138.99,60.59,89.55
8.58,8.88,0.1085,0.1382,0.167,146.75,192.93,94.66
tw1,tw2,tw3,Exymax_tw1,Exymax_tw2,Exymax_tw3,Eyymax_tf,TFMmax_tw1,TFMmax_tw2,TFMmax_tw3,TFMmax_frame
4.31,4.2,7.17,0.081399,0.095849,0.072698,0.108977,134.548366,190.082025,108.322788,64.052035
8.9,15.25,14.89,0.031495,0.022573,0.035934,0.092647,57.393216,42.078067,63.577736,70.311308
13.66,6.66,9.52,0.00319,0.07446,0.058312,0.13204,3.396982,112.666323,93.5497,78.139829
6.1,15.97,15.39,0.063868,0.01835,0.027527,0.076159,124.897399,33.448277,53.074749,60.239866
15.13,12.97,13.51,0.004018,0.034796,0.046836,0.107686,4.025589,51.82394,71.848346,78.133494
14.48,9.59,9.39,0.002308,0.048369,0.070858,0.124101,2.033274,59.944574,93.278153,87.742995
8.1,8.81,11.46,0.027618,0.052304,0.043865,0.10637,49.122819,80.188079,68.535926,69.67043
12.35,6.07,10.61,0.006,0.083096,0.046531,0.138581,6.649448,136.799648,80.798243,84.222819
13.34,12.61,8.52,0.008128,0.025456,0.091674,0.156238,4.887164,46.362696,142.643906,111.330919
6.85,6.25,5.71,0.035243,0.066174,0.099031,0.122972,44.84485,99.562304,162.46509,85.261385
6.25,7.31,15.49,0.049554,0.062148,0.025138,0.105118,94.951211,122.343245,39.629697,74.472141
7.09,5.05,8.99,0.042398,0.089255,0.065508,0.126227,53.757512,163.051741,96.05268,77.752869
8.57,7.08,12.7,0.024167,0.07027,0.037139,0.120873,37.867938,119.727924,61.053057,79.575128
10.11,8.19,7.66,0.011041,0.053048,0.083008,0.133119,13.889176,65.950391,119.337159,91.006253
15.47,10.67,6.38,0.002016,0.030688,0.12562,0.199841,0.501939,51.154614,195.624496,127.200036
11.4,9.36,5.26,0.008165,0.037452,0.142809,0.217763,4.294001,55.877096,221.043167,131.050256
10.42,11.4,14.42,0.019029,0.041132,0.035492,0.097588,30.314636,57.158768,57.712997,68.169248
15.74,11.64,9.98,0.001834,0.035935,0.070494,0.135131,1.273758,51.168136,100.688275,94.388808
9.6,10,14.66,0.022155,0.049319,0.030585,0.103824,35.059676,73.416591,51.239563,70.437925
10.94,13.34,4.53,0.003528,0.031492,0.183818,0.299727,4.735379,43.048835,282.031569,166.858747
4.84,6.74,13.16,0.073389,0.071397,0.029199,0.101072,137.653813,134.472653,48.028251,67.562731
7.54,15.08,15.77,0.043883,0.021257,0.029221,0.079362,84.905089,40.147259,54.852247,63.765576
13.2,10.43,10.91,0.005424,0.045138,0.058683,0.115856,5.80229,57.211589,79.575384,81.520821
5.22,12.21,11.59,0.065757,0.030992,0.038554,0.087437,137.687928,46.297607,63.262881,64.616355
12,8.6,6.43,0.008815,0.043244,0.109378,0.162352,4.677008,63.000732,164.736376,108.219404
4.17,8.21,8.42,0.079324,0.057529,0.049129,0.089576,159.806855,88.449494,84.383253,64.279121
5.75,11.16,4.73,0.039198,0.024313,0.13419,0.21286,62.87353,40.472625,229.80486,133.110083
9.21,14.18,6.9,0.018554,0.01824,0.114677,0.202976,24.001688,41.080395,193.353257,132.016235
14,11.91,10.33,0.003846,0.035424,0.067055,0.131589,3.928772,50.493478,96.660614,92.572835
11.54,13.05,8.15,0.012378,0.022271,0.095438,0.165794,11.521937,44.468864,154.143346,116.195375
14.93,7.67,13.82,0.002878,0.070911,0.031811,0.127962,3.58333,111.159423,54.470882,83.78636
14.63,14.39,7.54,0.003912,0.02045,0.115028,0.202676,1.699369,43.837282,183.768419,131.646738
7.84,4.77,4.29,0.018894,0.093529,0.145453,0.133002,23.411834,156.258923,217.332673,82.399364
12.97,14.74,11.99,0.009156,0.024787,0.05798,0.126709,11.993929,44.728202,90.297399,90.415749
9.79,9.91,6.04,0.015883,0.029963,0.121542,0.189838,13.885248,51.265605,194.326035,123.057041
10.68,4.54,5.16,0.011105,0.10968,0.148431,0.119606,7.260995,170.788362,186.293663,75.03848
5.09,5.23,10.17,0.011187,0.019287,0.006086,0.026304,8.194955,13.910131,4.450286,6.477902
8.31,13.75,12.36,0.031634,0.025238,0.046339,0.105155,58.569082,41.87826,74.95785,77.251682
6.67,15.4,12.85,0.047674,0.017424,0.040644,0.097687,93.256389,35.43329,69.996239,71.963851
12.48,5.58,14.13,0.015131,0.123724,0.032679,0.144419,9.305156,172.963653,50.031068,95.784168
6.51,8.52,9.95,0.0392,0.047,0.0501,0.102,76.67,80.06,79.45,68.15
5.97,7.83,9.09,0.0442,0.0496,0.0535,0.1026,85.15,88.95,88.87,67.99
5.48,7.24,8.33,0.0492,0.0512,0.06,0.1,94.25,97.9219,97.944,67.6004
tw1,tw2,tw3,Exymax_tw1,Exymax_tw2,Exymax_tw3,Eyymax_tf,TFMmax_tw1,TFMmax_tw2,TFMmax_tw3,TFMmax_frame
4.31,4.2,7.17,0.077278,0.099433,0.07136,0.105858,147.878573,220.768798,101.119942,62.099064
8.9,15.25,14.89,0.041111,0.023281,0.037374,0.086715,84.334946,48.071213,68.923779,69.439653
13.66,6.66,9.52,0.007904,0.078019,0.065014,0.133502,6.504963,139.567651,99.718686,79.629423
6.1,15.97,15.39,0.084848,0.020461,0.026532,0.069238,176.049626,36.821171,54.309485,61.537735
15.13,12.97,13.51,0.007814,0.039871,0.04961,0.110497,10.818199,84.135206,76.816984,77.361745
14.48,9.59,9.39,0.003967,0.049927,0.077467,0.126513,4.344599,74.657887,108.476252,87.625099
8.1,8.81,11.46,0.033116,0.056058,0.043096,0.099836,67.528832,102.267804,74.4842,69.05684
12.35,6.07,10.61,0.012336,0.090698,0.05266,0.137926,11.815061,166.986893,81.734669,87.066158
13.34,12.61,8.52,0.008171,0.026648,0.104446,0.189063,8.131454,53.262055,151.623399,111.763432
6.85,6.25,5.71,0.03181,0.059832,0.100103,0.130728,53.490454,113.244873,171.974781,78.994871
6.25,7.31,15.49,0.054579,0.069377,0.021757,0.101748,119.174198,151.435644,39.256364,74.433643
7.09,5.05,8.99,0.03897,0.090617,0.061091,0.12271,64.699935,189.351435,92.114421,78.061736
8.57,7.08,12.7,0.028071,0.071978,0.034181,0.114246,52.019873,146.43148,60.737429,81.175919
10.11,8.19,7.66,0.017266,0.051398,0.087753,0.134824,21.67616,82.036301,135.029353,87.004476
15.47,10.67,6.38,0.001587,0.032694,0.142549,0.252741,1.006736,57.678082,202.88021,124.987445
11.4,9.36,5.26,0.005596,0.03711,0.155095,0.252796,5.64689,64.072128,227.245419,123.588634
10.42,11.4,14.42,0.025249,0.044515,0.035989,0.092764,47.80185,76.222466,62.650545,69.963801
15.74,11.64,9.98,0.004815,0.036171,0.084571,0.139391,3.374214,60.499531,114.112073,95.184938
9.6,10,14.66,0.028564,0.047937,0.031012,0.101013,54.119329,96.883445,55.770232,72.066717
10.94,13.34,4.53,0.003914,0.031012,0.196972,0.331538,4.832536,45.660551,292.305359,157.641734
4.84,6.74,13.16,0.093097,0.097875,0.028174,0.094645,170.965164,159.699879,43.676179,65.264807
7.54,15.08,15.77,0.055862,0.02449,0.029117,0.073081,119.046303,45.763978,57.961619,60.585659
13.2,10.43,10.91,0.008903,0.047605,0.063671,0.111617,11.29028,69.441193,90.925046,82.0043
5.22,12.21,11.59,0.085896,0.031263,0.036812,0.073377,186.946525,55.224328,65.471473,58.960515
12,8.6,6.43,0.008114,0.044405,0.118614,0.191841,6.500439,75.48923,173.447598,104.276507
4.17,8.21,8.42,0.09005,0.064804,0.046111,0.079526,203.336388,97.727798,87.185754,57.875615
5.75,11.16,4.73,0.04199,0.023716,0.141726,0.222952,82.296901,42.457106,224.319044,119.086143
9.21,14.18,6.9,0.020821,0.023865,0.125409,0.252293,33.232262,45.762109,197.540708,128.04103
14,11.91,10.33,0.009023,0.03481,0.079382,0.135912,8.780833,58.836188,109.919328,93.306098
11.54,13.05,8.15,0.012983,0.023028,0.107911,0.199645,17.749952,50.523653,161.438746,115.199793
14.93,7.67,13.82,0.005025,0.079958,0.030859,0.130618,6.560153,141.413863,57.208313,89.048771
14.63,14.39,7.54,0.003689,0.022627,0.136104,0.265338,2.588103,49.771496,197.612032,133.781791
7.84,4.77,4.29,0.017731,0.099466,0.145601,0.134762,27.310518,165.862042,224.604753,80.570982
12.97,14.74,11.99,0.013823,0.025475,0.066175,0.126926,21.895116,52.727872,103.480296,91.528423
9.79,9.91,6.04,0.014588,0.030492,0.133053,0.225956,18.948648,58.155948,195.895431,117.007638
10.68,4.54,5.16,0.008352,0.112029,0.145064,0.119875,10.711904,188.86704,193.356306,74.434275
5.09,5.23,10.17,0.073286,0.095812,0.04163,0.107967,132.255626,197.295555,67.287296,71.347153
8.31,13.75,12.36,0.03989,0.026134,0.048687,0.09771,81.84379,48.120038,81.9614,74.892055
6.67,15.4,12.85,0.062848,0.019939,0.0412,0.086729,132.394523,38.787373,74.375273,67.490807
12.48,5.58,14.13,0.01392,0.113775,0.026927,0.142716,14.282247,205.971864,45.934162,101.084258
10.282,14.4686,15.9876,0.0321,0.0298,0.0342,0.0842,63.471,55.44,64.2316,69.4574
7.76,9.66,11.05,0.037,0.0419,0.0492,0.0957,76.1284,84.4513,77.7551,68.3479
6.99,8.73,9.78,0.04,0.0499,0.0519,0.0954,85.313,94.7645,89.5855,68.2746
6.35,8,8.78,0.0432,0.0512,0.0569,0.096,93.7124,103.0082,100.4389,67.799
output,model,cv_rmse,cv_mae,cv_r2,BEST_PARAMS,fit_time_sec,gpr_kernel
exymax_tw1,SVR,0.0043375719853902175,0.0043375719853902175,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",16.42,
exymax_tw1,FlexibleMLP,0.005942000225043672,0.005942000225043672,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.001383720431131858, ""mlp__learning_rate_init"": 0.00560022041574592, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 211}",180.61,
exymax_tw1,GradientBoosting,0.007811985938625476,0.007811985938625476,,"{""learning_rate"": 0.07137457847237384, ""max_depth"": 1, ""max_features"": 0.9840282197519947, ""n_estimators"": 1211, ""subsample"": 0.7574418333755193}",25.08,
exymax_tw1,GaussianProcess,0.007839157896198268,0.007839157896198268,,"{""gpr__amplitude"": 17.634356372428602, ""gpr__kernel_type"": ""Matern32"", ""gpr__length_scale"": 9.748618754819589, ""gpr__n_restarts_optimizer"": 2, ""gpr__noise"": 8.217425987016864e-08, ""gpr__rq_alpha"": 1.2480582138615424}",20.72,"2.23**2 * Matern(length_scale=4.85, nu=1.5) + WhiteKernel(noise_level=8.22e-08)"
exymax_tw1,XGBoost,0.013519485088437793,0.013519485088437793,,"{""colsample_bytree"": 0.8326890370087766, ""learning_rate"": 0.013163689227404577, ""max_depth"": 4, ""min_child_weight"": 3, ""n_estimators"": 686, ""subsample"": 0.9457892792724425}",19.04,
exymax_tw1,RandomForest,0.015878616417497236,0.015878616417497236,,"{""max_depth"": 4, ""max_features"": 0.829638871689417, ""min_samples_leaf"": 2, ""min_samples_split"": 4, ""n_estimators"": 301}",32.82,
exymax_tw2,SVR,0.003898200989668,0.003898200989668,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",17.13,
exymax_tw2,GaussianProcess,0.004092552978458643,0.004092552978458643,,"{""gpr__amplitude"": 17.76576664980768, ""gpr__kernel_type"": ""RBF"", ""gpr__length_scale"": 2.467108843522573, ""gpr__n_restarts_optimizer"": 4, ""gpr__noise"": 2.3000512910597964e-12, ""gpr__rq_alpha"": 0.024337729656929853}",22.16,2.65**2 * RBF(length_scale=3.93) + WhiteKernel(noise_level=2.3e-12)
exymax_tw2,FlexibleMLP,0.00864041994087298,0.00864041994087298,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.00032616232211419725, ""mlp__learning_rate_init"": 0.005773101392423683, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 138}",261.75,
exymax_tw2,GradientBoosting,0.011504378515554227,0.011504378515554227,,"{""learning_rate"": 0.06709750202787818, ""max_depth"": 1, ""max_features"": 0.8738161757602809, ""n_estimators"": 1265, ""subsample"": 0.855691364449425}",24.33,
exymax_tw2,XGBoost,0.015490814410895112,0.015490814410895112,,"{""colsample_bytree"": 0.8669017799032459, ""learning_rate"": 0.011355860050299323, ""max_depth"": 1, ""min_child_weight"": 3, ""n_estimators"": 920, ""subsample"": 0.7377357738731759}",18.01,
exymax_tw2,RandomForest,0.020620972368113888,0.020620972368113888,,"{""max_depth"": 3, ""max_features"": 0.6190342545380507, ""min_samples_leaf"": 2, ""min_samples_split"": 3, ""n_estimators"": 204}",29.02,
eyymax_tf,GradientBoosting,0.006096959350644509,0.006096959350644509,,"{""learning_rate"": 0.07929606546342034, ""max_depth"": 1, ""max_features"": 0.7452453063052724, ""n_estimators"": 1273, ""subsample"": 0.8474562788291421}",25.41,
eyymax_tf,SVR,0.006257786413308531,0.006257786413308531,,"{""svr__C"": 135.78050704417745, ""svr__epsilon"": 0.006585522584687186, ""svr__gamma"": 0.00018964721502153097}",18.45,
eyymax_tf,FlexibleMLP,0.006448221187257262,0.006448221187257262,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.0011571030999086114, ""mlp__learning_rate_init"": 0.0009804697273478402, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 316}",147.51,
eyymax_tf,XGBoost,0.00842710229307413,0.00842710229307413,,"{""colsample_bytree"": 0.758633921460061, ""learning_rate"": 0.016926614610268422, ""max_depth"": 1, ""min_child_weight"": 3, ""n_estimators"": 1154, ""subsample"": 0.7748922483843458}",21.07,
eyymax_tf,GaussianProcess,0.012449657479256554,0.012449657479256554,,"{""gpr__amplitude"": 90.45678397888028, ""gpr__kernel_type"": ""Matern32"", ""gpr__length_scale"": 8.516741937780402, ""gpr__n_restarts_optimizer"": 1, ""gpr__noise"": 3.6849095550901595e-05, ""gpr__rq_alpha"": 7.496075352927174}",21.5,"1.22**2 * Matern(length_scale=1.97, nu=1.5) + WhiteKernel(noise_level=3.71e-05)"
eyymax_tf,RandomForest,0.015233420798747652,0.015233420798747652,,"{""max_depth"": 2, ""max_features"": 0.705085509976784, ""min_samples_leaf"": 2, ""min_samples_split"": 3, ""n_estimators"": 327}",35.21,
tfmmax_frame,SVR,1.855708614995125,1.855708614995125,,"{""svr__C"": 869.6420997068749, ""svr__epsilon"": 0.005970174820257551, ""svr__gamma"": 0.013624631381822645}",16.32,
tfmmax_frame,GaussianProcess,2.6515387392115812,2.6515387392115812,,"{""gpr__amplitude"": 15.783879853890564, ""gpr__kernel_type"": ""Matern32"", ""gpr__length_scale"": 1.2778531518898433, ""gpr__n_restarts_optimizer"": 4, ""gpr__noise"": 3.279409439150647e-05, ""gpr__rq_alpha"": 3.0182479639711155}",21.39,"2.81**2 * Matern(length_scale=6.87, nu=1.5) + WhiteKernel(noise_level=3.28e-05)"
tfmmax_frame,GradientBoosting,3.0739719779986867,3.0739719779986867,,"{""learning_rate"": 0.06319043436330006, ""max_depth"": 1, ""max_features"": 0.7921866614153864, ""n_estimators"": 1126, ""subsample"": 0.7129971555911094}",29.49,
tfmmax_frame,XGBoost,6.187755622863769,6.187755622863769,,"{""colsample_bytree"": 0.8248153132081696, ""learning_rate"": 0.01950074290417516, ""max_depth"": 4, ""min_child_weight"": 3, ""n_estimators"": 1005, ""subsample"": 0.7001714851999588}",16.65,
tfmmax_frame,RandomForest,8.366368509903962,8.366368509903962,,"{""max_depth"": 4, ""max_features"": 0.9254599542742846, ""min_samples_leaf"": 2, ""min_samples_split"": 3, ""n_estimators"": 238}",27.92,
tfmmax_frame,FlexibleMLP,9.920601305703192,9.920601305703192,,"{""mlp__activation"": ""tanh"", ""mlp__alpha"": 1.1637548640792169e-05, ""mlp__learning_rate_init"": 0.0013892406853974423, ""mlp__n_layers"": 4, ""mlp__n_neurons"": 250}",278.87,
tfmmax_tw1,SVR,6.514565955904576,6.514565955904576,,"{""svr__C"": 6450.78941362485, ""svr__epsilon"": 0.05487965008895495, ""svr__gamma"": 0.01812192186911149}",15.24,
tfmmax_tw1,FlexibleMLP,9.284068888931218,9.284068888931218,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.0008992483274742313, ""mlp__learning_rate_init"": 0.002918401243823751, ""mlp__n_layers"": 2, ""mlp__n_neurons"": 156}",95.56,
tfmmax_tw1,GaussianProcess,14.30296148005305,14.30296148005305,,"{""gpr__amplitude"": 18.219591934358792, ""gpr__kernel_type"": ""Matern52"", ""gpr__length_scale"": 4.875265952210847, ""gpr__n_restarts_optimizer"": 7, ""gpr__noise"": 2.2546533558290293e-05, ""gpr__rq_alpha"": 51.601430103129445}",21.08,"2**2 * Matern(length_scale=3.32, nu=2.5) + WhiteKernel(noise_level=3.57e-09)"
tfmmax_tw1,GradientBoosting,14.381252311451142,14.381252311451142,,"{""learning_rate"": 0.07823837540818114, ""max_depth"": 1, ""max_features"": 0.9853042573891722, ""n_estimators"": 1135, ""subsample"": 0.8457490834822092}",26.54,
tfmmax_tw1,RandomForest,34.74099251727506,34.74099251727506,,"{""max_depth"": 3, ""max_features"": 0.9287094401669573, ""min_samples_leaf"": 2, ""min_samples_split"": 4, ""n_estimators"": 317}",26.84,
tfmmax_tw1,XGBoost,38.753928146362306,38.753928146362306,,"{""colsample_bytree"": 0.8230311876559941, ""learning_rate"": 0.08846938749167613, ""max_depth"": 4, ""min_child_weight"": 8, ""n_estimators"": 434, ""subsample"": 0.7101294823253874}",15.98,
tfmmax_tw2,FlexibleMLP,9.543413656978952,9.543413656978952,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 5.7116133827395183e-05, ""mlp__learning_rate_init"": 0.00551761396114586, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 193}",141.62,
tfmmax_tw2,SVR,11.570888179275114,11.570888179275114,,"{""svr__C"": 1748.5871627909407, ""svr__epsilon"": 0.007454032341804314, ""svr__gamma"": 0.0027015027768465026}",15.91,
tfmmax_tw2,GaussianProcess,12.533526604666376,12.533526604666376,,"{""gpr__amplitude"": 40.087236815917386, ""gpr__kernel_type"": ""Matern52"", ""gpr__length_scale"": 0.44330391401240743, ""gpr__n_restarts_optimizer"": 6, ""gpr__noise"": 7.062740227125941e-05, ""gpr__rq_alpha"": 0.015997026868562338}",22.25,"2.89**2 * Matern(length_scale=5.43, nu=2.5) + WhiteKernel(noise_level=0.01)"
tfmmax_tw2,GradientBoosting,15.831571838897695,15.831571838897695,,"{""learning_rate"": 0.0602220933692525, ""max_depth"": 1, ""max_features"": 0.7464566335853476, ""n_estimators"": 947, ""subsample"": 0.8134047096935451}",24.05,
tfmmax_tw2,XGBoost,25.7699333190918,25.7699333190918,,"{""colsample_bytree"": 0.9409585819009745, ""learning_rate"": 0.012332685611801043, ""max_depth"": 1, ""min_child_weight"": 3, ""n_estimators"": 456, ""subsample"": 0.7557823344596944}",19.17,
tfmmax_tw2,RandomForest,29.346652758699634,29.346652758699634,,"{""max_depth"": 3, ""max_features"": 0.9731382097529866, ""min_samples_leaf"": 2, ""min_samples_split"": 2, ""n_estimators"": 208}",26.25,
output,model,cv_rmse,cv_mae,cv_r2,BEST_PARAMS,fit_time_sec,gpr_kernel
exymax_tw1,SVR,0.001189142247241681,0.001189142247241681,,"{""svr__C"": 404.07318534699954, ""svr__epsilon"": 0.00016710787988155612, ""svr__gamma"": 0.004514122336435558}",18.23,
exymax_tw1,FlexibleMLP,0.005048090505980366,0.005048090505980366,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.0008838799592906985, ""mlp__learning_rate_init"": 0.004177632703117597, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 436}",148.5,
exymax_tw1,GaussianProcess,0.006312093784678959,0.006312093784678959,,"{""gpr__amplitude"": 17.76576664980768, ""gpr__kernel_type"": ""RBF"", ""gpr__length_scale"": 2.467108843522573, ""gpr__n_restarts_optimizer"": 4, ""gpr__noise"": 2.3000512910597964e-12, ""gpr__rq_alpha"": 0.024337729656929853}",23.57,3.71**2 * RBF(length_scale=4.05) + WhiteKernel(noise_level=2.3e-12)
exymax_tw1,GradientBoosting,0.00895457876536699,0.00895457876536699,,"{""learning_rate"": 0.07902213201887003, ""max_depth"": 1, ""max_features"": 0.6349185526665098, ""n_estimators"": 215, ""subsample"": 0.8333797991689352}",23.27,
exymax_tw1,XGBoost,0.02133591138198972,0.02133591138198972,,"{""colsample_bytree"": 0.7566359703826661, ""learning_rate"": 0.030021399042486907, ""max_depth"": 2, ""min_child_weight"": 3, ""n_estimators"": 718, ""subsample"": 0.8726674782089092}",20.03,
exymax_tw1,RandomForest,0.023419556369556352,0.023419556369556352,,"{""max_depth"": 4, ""max_features"": 0.9153682682439178, ""min_samples_leaf"": 2, ""min_samples_split"": 2, ""n_estimators"": 351}",33.54,
exymax_tw2,SVR,0.006084823423825638,0.006084823423825638,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",18.11,
exymax_tw2,GaussianProcess,0.006908668049211099,0.006908668049211099,,"{""gpr__amplitude"": 4.638469520725609, ""gpr__kernel_type"": ""Matern52"", ""gpr__length_scale"": 1.2053291263217372, ""gpr__n_restarts_optimizer"": 1, ""gpr__noise"": 8.248828845479934e-12, ""gpr__rq_alpha"": 5.102360139565328}",22.58,"4.44**2 * Matern(length_scale=7.88, nu=2.5) + WhiteKernel(noise_level=8.25e-12)"
exymax_tw2,FlexibleMLP,0.007996256320163552,0.007996256320163552,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.000880321286683897, ""mlp__learning_rate_init"": 0.007060463996516053, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 411}",322.83,
exymax_tw2,GradientBoosting,0.016576246300075752,0.016576246300075752,,"{""learning_rate"": 0.07576830092585919, ""max_depth"": 1, ""max_features"": 0.6050680737118149, ""n_estimators"": 1450, ""subsample"": 0.7795220681302544}",26.35,
exymax_tw2,XGBoost,0.022073993966728444,0.022073993966728444,,"{""colsample_bytree"": 0.7829626166649746, ""learning_rate"": 0.016734999863756417, ""max_depth"": 3, ""min_child_weight"": 3, ""n_estimators"": 450, ""subsample"": 0.7074825009207435}",17.65,
exymax_tw2,RandomForest,0.027188869611825754,0.027188869611825754,,"{""max_depth"": 2, ""max_features"": 0.8266845321307337, ""min_samples_leaf"": 2, ""min_samples_split"": 4, ""n_estimators"": 211}",28.54,
eyymax_tf,GradientBoosting,0.011193199924578985,0.011193199924578985,,"{""learning_rate"": 0.07921529866641679, ""max_depth"": 1, ""max_features"": 0.612242984332449, ""n_estimators"": 1057, ""subsample"": 0.7050910309575588}",28.34,
eyymax_tf,SVR,0.011857465998559311,0.011857465998559311,,"{""svr__C"": 2826.0748057569035, ""svr__epsilon"": 0.004530987105253192, ""svr__gamma"": 0.00011866053088778852}",17.84,
eyymax_tf,FlexibleMLP,0.01314030766673895,0.01314030766673895,,"{""mlp__activation"": ""tanh"", ""mlp__alpha"": 0.0057547661357822905, ""mlp__learning_rate_init"": 0.0009838602333915323, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 131}",121.31,
eyymax_tf,XGBoost,0.016556165184080603,0.016556165184080603,,"{""colsample_bytree"": 0.8498312216775288, ""learning_rate"": 0.12111332148440783, ""max_depth"": 4, ""min_child_weight"": 3, ""n_estimators"": 255, ""subsample"": 0.837230568715668}",16.76,
eyymax_tf,RandomForest,0.02332388150609082,0.02332388150609082,,"{""max_depth"": 3, ""max_features"": 0.8430653927533585, ""min_samples_leaf"": 2, ""min_samples_split"": 2, ""n_estimators"": 258}",33.86,
eyymax_tf,GaussianProcess,0.023338799620304204,0.023338799620304204,,"{""gpr__amplitude"": 0.07803438686428588, ""gpr__kernel_type"": ""Matern32"", ""gpr__length_scale"": 0.2078965734853746, ""gpr__n_restarts_optimizer"": 1, ""gpr__noise"": 8.434474667849261e-05, ""gpr__rq_alpha"": 10.123077209064054}",21.25,"1**2 * Matern(length_scale=1, nu=1.5) + WhiteKernel(noise_level=8.61e-05)"
tfmmax_frame,SVR,3.7496112763926472,3.7496112763926472,,"{""svr__C"": 888.0357335845127, ""svr__epsilon"": 0.00027560887878157734, ""svr__gamma"": 0.011255115229536036}",17.5,
tfmmax_frame,GradientBoosting,4.8270125720839445,4.8270125720839445,,"{""learning_rate"": 0.0746851190210031, ""max_depth"": 1, ""max_features"": 0.9587765657308649, ""n_estimators"": 1089, ""subsample"": 0.7583363101118213}",26.78,
tfmmax_frame,GaussianProcess,4.837254198940551,4.837254198940551,,"{""gpr__amplitude"": 50.74746039780979, ""gpr__kernel_type"": ""RQ"", ""gpr__length_scale"": 3.2803146222012094, ""gpr__n_restarts_optimizer"": 10, ""gpr__noise"": 2.8202012317991376e-05, ""gpr__rq_alpha"": 0.011355724181092134}",21.96,"2.13**2 * RationalQuadratic(alpha=1e+03, length_scale=3.12) + WhiteKernel(noise_level=0.01)"
tfmmax_frame,RandomForest,11.444069767701095,11.444069767701095,,"{""max_depth"": 4, ""max_features"": 0.9434532776047838, ""min_samples_leaf"": 2, ""min_samples_split"": 2, ""n_estimators"": 267}",29.34,
tfmmax_frame,FlexibleMLP,11.539183854648009,11.539183854648009,,"{""mlp__activation"": ""tanh"", ""mlp__alpha"": 7.12939562970623e-05, ""mlp__learning_rate_init"": 0.003177327901978699, ""mlp__n_layers"": 2, ""mlp__n_neurons"": 199}",174.36,
tfmmax_frame,XGBoost,16.035713996887207,16.035713996887207,,"{""colsample_bytree"": 0.8230311876559941, ""learning_rate"": 0.08846938749167613, ""max_depth"": 4, ""min_child_weight"": 8, ""n_estimators"": 434, ""subsample"": 0.7101294823253874}",18.49,
tfmmax_tw1,FlexibleMLP,7.568684427554417,7.568684427554417,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.007095015676184179, ""mlp__learning_rate_init"": 0.0010350129402744886, ""mlp__n_layers"": 1, ""mlp__n_neurons"": 93}",100.03,
tfmmax_tw1,SVR,8.039932143299065,8.039932143299065,,"{""svr__C"": 1994.8647978722083, ""svr__epsilon"": 0.00021654001099593486, ""svr__gamma"": 0.03888686900735536}",15.73,
tfmmax_tw1,GaussianProcess,13.740675480841517,13.740675480841517,,"{""gpr__amplitude"": 15.783879853890564, ""gpr__kernel_type"": ""Matern32"", ""gpr__length_scale"": 1.2778531518898433, ""gpr__n_restarts_optimizer"": 4, ""gpr__noise"": 3.279409439150647e-05, ""gpr__rq_alpha"": 3.0182479639711155}",21.74,"2.32**2 * Matern(length_scale=5.45, nu=1.5) + WhiteKernel(noise_level=4.01e-09)"
tfmmax_tw1,GradientBoosting,17.141373703495926,17.141373703495926,,"{""learning_rate"": 0.07396113057663836, ""max_depth"": 1, ""max_features"": 0.9717797606360448, ""n_estimators"": 1293, ""subsample"": 0.7123782305265941}",25.71,
tfmmax_tw1,RandomForest,42.146722970181784,42.146722970181784,,"{""max_depth"": 3, ""max_features"": 0.9287094401669573, ""min_samples_leaf"": 2, ""min_samples_split"": 4, ""n_estimators"": 317}",26.84,
tfmmax_tw1,XGBoost,48.06999996185303,48.06999996185303,,"{""colsample_bytree"": 0.8230311876559941, ""learning_rate"": 0.08846938749167613, ""max_depth"": 4, ""min_child_weight"": 8, ""n_estimators"": 434, ""subsample"": 0.7101294823253874}",17.01,
tfmmax_tw2,SVR,11.314099136094795,11.314099136094795,,"{""svr__C"": 1932.2768280341643, ""svr__epsilon"": 0.0002731420424984291, ""svr__gamma"": 0.002631082857769719}",17.78,
tfmmax_tw2,GaussianProcess,11.740411233920078,11.740411233920078,,"{""gpr__amplitude"": 45.40039827973866, ""gpr__kernel_type"": ""Matern52"", ""gpr__length_scale"": 4.039013182877404, ""gpr__n_restarts_optimizer"": 9, ""gpr__noise"": 8.056080173461021e-05, ""gpr__rq_alpha"": 3.2552168293637562}",23.85,"3.2**2 * Matern(length_scale=6, nu=2.5) + WhiteKernel(noise_level=0.00705)"
tfmmax_tw2,FlexibleMLP,16.143769361330094,16.143769361330094,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 2.1364117995410688e-05, ""mlp__learning_rate_init"": 0.008607998150089705, ""mlp__n_layers"": 1, ""mlp__n_neurons"": 296}",153.78,
tfmmax_tw2,GradientBoosting,20.973005630229387,20.973005630229387,,"{""learning_rate"": 0.06776310572559845, ""max_depth"": 1, ""max_features"": 0.718511154584248, ""n_estimators"": 551, ""subsample"": 0.8307384126536832}",26.92,
tfmmax_tw2,XGBoost,29.3582400894165,29.3582400894165,,"{""colsample_bytree"": 0.883958676078517, ""learning_rate"": 0.012828797968407808, ""max_depth"": 4, ""min_child_weight"": 3, ""n_estimators"": 659, ""subsample"": 0.9108553797312076}",23.9,
tfmmax_tw2,RandomForest,37.69341472763347,37.69341472763347,,"{""max_depth"": 4, ""max_features"": 0.9849854754087094, ""min_samples_leaf"": 2, ""min_samples_split"": 2, ""n_estimators"": 330}",29.01,
output,best_model,cv_rmse,cv_mae,cv_r2,BEST_PARAMS,model_path,train_time_sec,gpr_kernel
exymax_tw2,SVR,0.003898200989668,0.003898200989668,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_exymax_tw2.joblib,372.41,
exymax_tw1,SVR,0.0043375719853902175,0.0043375719853902175,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_exymax_tw1.joblib,294.69,
eyymax_tf,GradientBoosting,0.006096959350644509,0.006096959350644509,,"{""learning_rate"": 0.07929606546342034, ""max_depth"": 1, ""max_features"": 0.7452453063052724, ""n_estimators"": 1273, ""subsample"": 0.8474562788291421}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_eyymax_tf.joblib,269.21,
tfmmax_frame,SVR,1.855708614995125,1.855708614995125,,"{""svr__C"": 869.6420997068749, ""svr__epsilon"": 0.005970174820257551, ""svr__gamma"": 0.013624631381822645}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_tfmmax_frame.joblib,390.64,
tfmmax_tw1,SVR,6.514565955904576,6.514565955904576,,"{""svr__C"": 6450.78941362485, ""svr__epsilon"": 0.05487965008895495, ""svr__gamma"": 0.01812192186911149}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_tfmmax_tw1.joblib,201.26,
tfmmax_tw2,FlexibleMLP,9.543413656978952,9.543413656978952,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 5.7116133827395183e-05, ""mlp__learning_rate_init"": 0.00551761396114586, ""mlp__n_layers"": 5, ""mlp__n_neurons"": 193}",../../models/width_optimization/2W/per_output_models_B29_H30/best_model_tfmmax_tw2.joblib,249.26,
output,best_model,cv_rmse,cv_mae,cv_r2,BEST_PARAMS,model_path,train_time_sec,gpr_kernel
exymax_tw1,SVR,0.001189142247241681,0.001189142247241681,,"{""svr__C"": 404.07318534699954, ""svr__epsilon"": 0.00016710787988155612, ""svr__gamma"": 0.004514122336435558}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_exymax_tw1.joblib,267.15,
exymax_tw2,SVR,0.006084823423825638,0.006084823423825638,,"{""svr__C"": 1776.5766649807683, ""svr__epsilon"": 0.00032780432870046914, ""svr__gamma"": 0.006225026900894044}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_exymax_tw2.joblib,436.07,
eyymax_tf,GradientBoosting,0.011193199924578985,0.011193199924578985,,"{""learning_rate"": 0.07921529866641679, ""max_depth"": 1, ""max_features"": 0.612242984332449, ""n_estimators"": 1057, ""subsample"": 0.7050910309575588}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_eyymax_tf.joblib,239.43,
tfmmax_frame,SVR,3.7496112763926472,3.7496112763926472,,"{""svr__C"": 888.0357335845127, ""svr__epsilon"": 0.00027560887878157734, ""svr__gamma"": 0.011255115229536036}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_tfmmax_frame.joblib,288.43,
tfmmax_tw1,FlexibleMLP,7.568684427554417,7.568684427554417,,"{""mlp__activation"": ""relu"", ""mlp__alpha"": 0.007095015676184179, ""mlp__learning_rate_init"": 0.0010350129402744886, ""mlp__n_layers"": 1, ""mlp__n_neurons"": 93}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_tfmmax_tw1.joblib,207.07,
tfmmax_tw2,SVR,11.314099136094795,11.314099136094795,,"{""svr__C"": 1932.2768280341643, ""svr__epsilon"": 0.0002731420424984291, ""svr__gamma"": 0.002631082857769719}",../../models/width_optimization/2W/per_output_models_B34_H30/best_model_tfmmax_tw2.joblib,275.25,
output,LOO_RMSE,LOO_MAE,LOO_R2
exymax_tw1,0.009031096283822699,0.00639049667196738,0.7488425908078558
exymax_tw2,0.010144034382549304,0.007481598874488774,0.8507509205564329
eyymax_tf,0.01855943741358681,0.014915287653860793,0.20898177606080248
tfmmax_tw1,16.482900682149094,11.922266262795016,0.8203483103328286
tfmmax_tw2,13.634479284114308,12.461129872354427,0.8760890162159998
tfmmax_frame,13.655098463858259,8.863927376532523,-0.3693424823899063
output,LOO_RMSE,LOO_MAE,LOO_R2
exymax_tw1,0.008624147551296939,0.007359782796953446,0.9079024175676164
exymax_tw2,0.013706392904955044,0.011269859697382568,0.8523816279486677
eyymax_tf,0.03046368205400917,0.026968235521929725,0.19941647286709363
tfmmax_tw1,20.167678560728703,14.961081819799762,0.8294924002148749
tfmmax_tw2,18.43696053024691,15.196129032214937,0.8511144659365918
tfmmax_frame,16.464322002393846,10.135962969435168,-0.11391813227750935
Parameter,Value
Configuration_W,2.0
Configuration_B,29.0
Configuration_H,30.0
Configuration_TFD_W,90.0
tw1_optimal,12.428504327084426
tw2_optimal,14.122563211721598
Objective_score,0.00016248777890609986
Exy_tw1,0.04570201477839575
Exy_tw2,0.06527161928725705
TFM_tw1,90.0027179003182
TFM_tw2,90.00000000109232
TFM_frame,90.01252013760173
Parameter,Value
Configuration_W,2.0
Configuration_B,34.0
Configuration_H,30.0
Configuration_TFD_W,90.0
tw1_optimal,15.377136277142208
tw2_optimal,18.215164279262194
Objective_score,180.3879020502771
Exy_tw1,0.04956946143084351
Exy_tw2,0.05571476809143794
TFM_tw1,94.84627564795838
TFM_tw2,90.99630804401522
TFM_frame,102.4863480260018
Parameter,Value
Configuration_W,2.0
Configuration_B,29.0
Configuration_H,30.0
Configuration_TFD_W,90.0
tw1_optimal,12.336370134679157
tw2_optimal,14.334462882300853
Objective_score,1.2640532073479944
Exy_tw1,0.047528218580697175
Exy_tw2,0.06218794179329757
TFM_tw1,90.00000003127688
TFM_tw2,90.01241648568478
TFM_frame,88.73609935656434
Parameter,Value
Configuration_W,2.0
Configuration_B,34.0
Configuration_H,30.0
Configuration_TFD_W,90.0
tw1_optimal,16.415097965898788
tw2_optimal,19.867887348820467
Objective_score,115.10663647284584
Exy_tw1,0.0472525132964864
Exy_tw2,0.050186561379984236
TFM_tw1,92.59810789514367
TFM_tw2,76.2781438489129
TFM_frame,99.72803257868077
# Latin Hypercube Sampling Generator for RESILINK
## Overview
This script generates Latin Hypercube Sampling (LHS) cases for varying geometric features. It is designed to produce datasets for the RESILINK project that can be fed to the FEM software developed during the project.
The script defines a fixed number of calculations and a range for each geometric feature, applying **maximin LHS** to ensure well-distributed samples across the feature space. The generated samples are saved in CSV and Excel formats for further processing.
## Files
- **`LatinHypercubeSampleGeneratorRESILINK.py`**: Generates LHS cases by sampling geometric features and exporting the results.
## Dependencies
This script requires the following Python libraries:
- `numpy`
- `pandas`
- `pyDOE`
- `openpyxl` (used by pandas for the Excel export)
You can install them using:
```sh
pip install numpy pandas pyDOE openpyxl
```
## Usage
### Running the Script
```sh
python LatinHypercubeSampleGeneratorRESILINK.py
```
This script:
1. Defines the geometric feature ranges.
2. Generates **n** LHS samples using the **maximin** criterion.
3. Saves the sampled data to `ml_features.csv` and `ml_features.xlsx`.
## Output
The script generates and saves the sampled geometric feature sets in:
- `ml_features.csv` (Comma-Separated Values format)
- `ml_features.xlsx` (Excel format)
These files contain:
- `Model` (sample identifier)
- **m** different geometric features (`tw1` … `tw5` in the default configuration).
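The unit-interval LHS matrix is mapped onto the physical ranges with a simple affine transform, the same formula used in the script below. A minimal numpy sketch; the 3×2 unit sample here is hypothetical, standing in for pyDOE's `lhs()` output:

```python
import numpy as np

# Hypothetical 3x2 unit-interval LHS matrix, standing in for lhs() output
u = np.array([[0.10, 0.90],
              [0.50, 0.25],
              [0.85, 0.60]])

# Feature ranges, as in self.ranges (two features here for brevity)
ranges = np.array([[4.0, 14.0],
                   [4.0, 14.0]])

# Map each column from [0, 1] onto its physical range: lo + (hi - lo) * u
lo, hi = ranges[:, 0], ranges[:, 1]
samples = np.round(lo + (hi - lo) * u, 2)
print(samples)
```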
# -*- coding: utf-8 -*-
"""
Created on Tue Jan 10 08:23:00 2023.
@author: jig
"""
from pyDOE import lhs
import pandas as pd
class GenerateLHSCases(object):
"""
Class for generating a number of cases by varying a set of geometric features.
This class is used to generate datasets for the RESILINK project, which can be
applied to the FEM software developed during the project. It utilizes Latin
Hypercube Sampling (LHS) to ensure a well-distributed sampling of the feature
space.
"""
def __init__(self):
"""Define the desired parameters for this case."""
# Info available in: https://github.com/tisimst/pyDOE/blob/master/pyDOE/doe_lhs.py
self.n_calculations = 64 # Number of desired calculations
self.criterion = 'maximin' # 'center', 'maximin', 'centermaximin', 'correlation'
self.iterations = 5 # Number of iterations for maximin and correlation (Default: 5)
self.ranges = [[4.0, 14.0],
[4.0, 14.0],
[4.0, 14.0],
[4.0, 14.0],
# [4.0, 14.0],
[4.0, 14.0]]
self.lhs_parameters = None
def run(self):
"""Execute the generation of cases."""
self.create_lhs_design()
self.generate_features_dataframe()
def create_lhs_design(self):
"""Create the Latin Hypercube Sampling design."""
self.lhs_parameters = lhs(len(self.ranges),
samples=self.n_calculations,
criterion=self.criterion)
def generate_features_dataframe(self):
"""Generate the DataFrame from the Latin Hypercube Sampling design."""
features = [
[f'ml{ii + 1}'] + [
round(r[0] + (r[1] - r[0]) * self.lhs_parameters[ii][i], 2)
for i, r in enumerate(self.ranges)
]
for ii in range(self.n_calculations)
]
# Define the column names for the DataFrame
column_names = ['Model', 'tw1', 'tw2', 'tw3', 'tw4', 'tw5',] # 'tf']
# Create the DataFrame
features_df = pd.DataFrame(features, columns=column_names)
# Save as CSV and Excel
features_df.to_csv('ml_features.csv', index=False)
features_df.to_excel('ml_features.xlsx', index=False)
print("DataFrame saved as 'ml_features.csv' and 'ml_features.xlsx'.")
# Example usage
if __name__ == "__main__":
generator = GenerateLHSCases()
generator.run()
Model,tw1,tw2,tw3,tw4,tw5
ml1,12.17,12.65,8.1,6.7,9.6
ml2,6.68,10.4,9.11,12.56,9.08
ml3,9.13,7.6,6.67,8.31,6.48
ml4,5.49,8.73,13.49,10.63,12.22
ml5,11.4,13.69,13.65,10.22,6.22
ml6,9.62,4.46,8.56,10.02,8.11
ml7,13.73,7.17,7.87,12.41,10.49
ml8,9.28,10.72,6.11,13.28,6.84
ml9,9.36,6.61,12.98,11.24,8.26
ml10,11.68,9.57,5.19,13.46,5.52
ml11,4.11,5.17,10.29,12.27,4.49
ml12,11.34,11.53,7.19,5.5,9.21
ml13,11.06,10.12,8.76,9.88,12.57
ml14,6.45,11.45,8.24,4.4,13.03
ml15,4.31,7.77,4.22,8.94,9.65
ml16,13.64,12.29,7.36,4.83,4.9
ml17,12.59,5.88,8.44,6.15,12.41
ml18,9.84,13.34,9.73,11.67,11.44
ml19,13.06,10.98,7.72,7.13,10.87
ml20,4.97,8.88,4.99,5.95,5.69
ml21,12.65,13.72,12.08,7.56,8.88
ml22,9.75,9.69,11.8,6.25,4.64
ml23,5.97,6.47,9.39,10.26,10.37
ml24,11.58,4.77,10.83,13.82,4.33
ml25,8.14,12.14,4.41,10.87,13.77
ml26,10.2,12.45,6.29,10.95,7.18
ml27,6.22,11.86,13.89,7.99,10.91
ml28,10.92,9.38,4.15,7.04,4.11
ml29,12.82,10.41,11.23,8.48,5.79
ml30,10.82,11.26,5.92,10.52,9.43
ml31,5.8,11.69,7.93,6.35,11.9
ml32,12.4,6.3,9.0,8.22,13.07
ml33,6.83,6.84,11.11,5.69,10.17
ml34,10.46,13.43,10.08,9.6,12.02
ml35,7.95,9.86,6.36,12.04,8.62
ml36,10.71,5.61,11.93,11.88,13.31
ml37,8.67,9.13,10.91,9.71,11.63
ml38,7.42,12.77,12.39,5.14,11.71
ml39,8.71,7.43,4.77,5.08,14.0
ml40,4.29,8.21,5.8,9.25,7.83
ml41,8.49,9.99,13.18,4.24,7.7
ml42,4.93,8.3,4.62,13.87,13.53
ml43,12.04,10.65,13.78,7.84,6.14
ml44,4.54,12.95,11.5,12.77,6.56
ml45,7.1,12.08,9.92,4.65,5.22
ml46,11.96,5.37,13.22,13.67,7.45
ml47,7.15,11.06,9.18,7.62,7.96
ml48,13.22,9.26,7.03,11.6,5.27
ml49,4.71,6.75,9.6,4.04,8.75
ml50,7.69,8.51,7.51,12.94,10.05
ml51,6.65,4.87,12.23,6.53,11.29
ml52,13.38,7.57,12.84,11.05,12.6
ml53,7.44,4.16,5.25,5.37,6.67
ml54,6.08,7.07,5.49,8.55,11.09
ml55,10.35,4.06,4.85,4.53,4.18
ml56,5.39,5.5,11.5,9.11,5.01
ml57,7.87,5.76,10.57,8.8,6.98
ml58,13.87,8.63,6.51,7.32,13.58
ml59,13.07,5.02,10.16,13.08,8.38
ml60,5.21,7.91,10.48,12.7,7.38
ml61,8.34,6.07,6.95,5.87,10.65
ml62,10.08,13.19,12.66,9.43,12.79
ml63,8.96,4.52,5.58,6.91,9.82
ml64,5.68,13.91,12.58,11.42,5.93
"""
Flexible Gaussian Process Regression (GPR) Wrapper
=================================================
This module provides a scikit-learn compatible wrapper for
GaussianProcessRegressor that allows dynamic selection and configuration
of kernels during Bayesian optimization.
"""
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
RBF, Matern, RationalQuadratic, WhiteKernel, ConstantKernel
)
class FlexibleGPR(BaseEstimator, RegressorMixin):
"""
A wrapper for GaussianProcessRegressor with a configurable kernel via
hyperparameters optimized through BayesSearchCV.
"""
def __init__(
self,
kernel_type: str = "RBF", # "RBF" | "Matern32" | "Matern52" | "RQ"
amplitude: float = 1.0,
length_scale: float = 1.0,
rq_alpha: float = 1.0,
noise: float = 1e-6,
normalize_y: bool = True,
n_restarts_optimizer: int = 5,
random_state: int = 42,
):
"""
Parameters
----------
kernel_type : str
Type of kernel to use ('RBF', 'Matern32', 'Matern52', 'RQ').
amplitude : float
Amplitude of the constant kernel.
length_scale : float
Length scale of the base kernel.
rq_alpha : float
Alpha parameter for RationalQuadratic kernel.
noise : float
Noise level for WhiteKernel.
normalize_y : bool
Whether to normalize the target values.
n_restarts_optimizer : int
Number of restarts for the optimizer.
random_state : int
Random seed for reproducibility.
"""
self.kernel_type = kernel_type
self.amplitude = amplitude
self.length_scale = length_scale
self.rq_alpha = rq_alpha
self.noise = noise
self.normalize_y = normalize_y
self.n_restarts_optimizer = n_restarts_optimizer
self.random_state = random_state
self.model_ = None
def _build_kernel(self):
# length_scale: can use a scalar (isotropic) or a vector (ARD).
# We start with isotropic for robustness with small datasets.
ls = float(self.length_scale)
if self.kernel_type == "RBF":
base = RBF(length_scale=ls, length_scale_bounds=(1e-6, 1e3))
elif self.kernel_type == "Matern32":
base = Matern(
length_scale=ls, nu=1.5, length_scale_bounds=(1e-6, 1e3)
)
elif self.kernel_type == "Matern52":
base = Matern(
length_scale=ls, nu=2.5, length_scale_bounds=(1e-6, 1e3)
)
elif self.kernel_type == "RQ":
base = RationalQuadratic(
length_scale=ls,
alpha=float(self.rq_alpha),
length_scale_bounds=(1e-6, 1e3),
alpha_bounds=(1e-6, 1e3)
)
else:
raise ValueError(f"Unknown kernel_type: {self.kernel_type}")
k = ConstantKernel(
constant_value=float(self.amplitude),
constant_value_bounds=(1e-6, 1e3)
) * base
k += WhiteKernel(
noise_level=float(self.noise),
noise_level_bounds=(1e-12, 1e-2)
)
return k
def fit(self, x_data, y):
"""
Fit the Gaussian Process model.
Parameters
----------
x_data : array-like of shape (n_samples, n_features)
Training data.
y : array-like of shape (n_samples,) or (n_samples, n_targets)
Target values.
Returns
-------
self : object
Returns self.
"""
kernel = self._build_kernel()
self.model_ = GaussianProcessRegressor(
kernel=kernel,
normalize_y=self.normalize_y,
n_restarts_optimizer=int(self.n_restarts_optimizer),
random_state=self.random_state,
)
self.model_.fit(x_data, y)
return self
def predict(self, x_data):
"""
Predict using the Gaussian Process model.
Parameters
----------
x_data : array-like of shape (n_samples, n_features)
Samples to predict.
Returns
-------
y : array-like of shape (n_samples,) or (n_samples, n_targets)
Predicted values.
"""
return self.model_.predict(x_data)
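A usage sketch of the composite kernel this wrapper assembles for `kernel_type="Matern52"` (amplitude × Matern(ν=2.5) + white noise, with the same bounds as `_build_kernel`). The 1-D sine data is a toy stand-in for the FEM outputs, not project data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel, ConstantKernel

# The kernel FlexibleGPR builds for kernel_type="Matern52"
kernel = (
    ConstantKernel(constant_value=1.0, constant_value_bounds=(1e-6, 1e3))
    * Matern(length_scale=1.0, nu=2.5, length_scale_bounds=(1e-6, 1e3))
    + WhiteKernel(noise_level=1e-6, noise_level_bounds=(1e-12, 1e-2))
)

# Toy 1-D data
x = np.linspace(0.0, 5.0, 25).reshape(-1, 1)
y = np.sin(x).ravel()

gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                               n_restarts_optimizer=5, random_state=42)
gpr.fit(x, y)
y_hat, y_std = gpr.predict(x, return_std=True)
```

With near-zero learned noise, the posterior mean reproduces smooth training data almost exactly, which is the regime the tight `WhiteKernel` bounds target.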
"""
Flexible Multilayer Perceptron (MLP) Wrapper
===========================================
This module provides a scikit-learn compatible wrapper for MLPRegressor
that allows dynamic configuration of hidden layer architecture.
"""
from sklearn.neural_network import MLPRegressor
from sklearn.base import BaseEstimator, RegressorMixin
class FlexibleMLP(BaseEstimator, RegressorMixin):
"""
Flexible wrapper around sklearn's MLPRegressor to allow dynamic control
over the number of layers and neurons per layer during Bayesian
optimization.
"""
def __init__(self, n_layers=2, n_neurons=128, activation='relu',
alpha=1e-4, learning_rate_init=1e-3,
random_state=42, max_iter=5000):
"""
Parameters
----------
n_layers : int
Number of hidden layers.
n_neurons : int
Number of neurons per hidden layer.
activation : str
Activation function ('relu', 'tanh', etc.).
alpha : float
L2 regularization term (weight decay).
learning_rate_init : float
Initial learning rate.
random_state : int
Random seed for reproducibility.
max_iter : int
Maximum number of training iterations.
"""
self.n_layers = n_layers
self.n_neurons = n_neurons
self.activation = activation
self.alpha = alpha
self.learning_rate_init = learning_rate_init
self.random_state = random_state
self.max_iter = max_iter
self.model_ = None
def fit(self, x_data, y):
"""
Fit the MLP model with the given architecture.
Parameters
----------
x_data : array-like of shape (n_samples, n_features)
Training data.
y : array-like of shape (n_samples,)
Target values.
Returns
-------
self : object
Returns self.
"""
hidden_layers = tuple([self.n_neurons] * self.n_layers)
self.model_ = MLPRegressor(
hidden_layer_sizes=hidden_layers,
activation=self.activation,
alpha=self.alpha,
learning_rate_init=self.learning_rate_init,
solver="adam",
early_stopping=True,
tol=1e-4,
n_iter_no_change=100,
validation_fraction=0.2,
max_iter=self.max_iter,
random_state=self.random_state
)
self.model_.fit(x_data, y)
return self
def predict(self, x_data):
"""
Predict target values for x_data.
Parameters
----------
x_data : array-like of shape (n_samples, n_features)
Samples to predict.
Returns
-------
y : array-like of shape (n_samples,)
Predicted values.
"""
return self.model_.predict(x_data)
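The core of this wrapper is the mapping from the two scalar hyperparameters `n_layers`/`n_neurons` (which BayesSearchCV can tune directly) onto sklearn's `hidden_layer_sizes` tuple. A minimal sketch with hypothetical values and toy data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# FlexibleMLP turns (n_layers, n_neurons) into a hidden_layer_sizes tuple
n_layers, n_neurons = 3, 64
hidden_layers = tuple([n_neurons] * n_layers)  # (64, 64, 64)

# Toy inputs standing in for the tw features
rng = np.random.default_rng(0)
x = rng.uniform(4.0, 14.0, size=(200, 2))
y = x[:, 0] + 2.0 * x[:, 1]

mlp = MLPRegressor(hidden_layer_sizes=hidden_layers, activation="relu",
                   solver="adam", early_stopping=True, validation_fraction=0.2,
                   max_iter=5000, random_state=42)
mlp.fit(x, y)
pred = mlp.predict(x[:5])
```

Exposing depth and width as two integers keeps the Bayesian search space low-dimensional, at the cost of forcing all hidden layers to the same size.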
"""
Radial Basis Function (RBF) Model Wrapper
========================================
This module provides a simple wrapper for scipy's Rbf interpolation,
facilitating its use as a surrogate model for structural optimization.
"""
import numpy as np
from scipy.interpolate import Rbf
class RBFModel:
"""
Radial Basis Function (RBF) surrogate model wrapper.
This class provides a simple interface to scipy's Rbf interpolation,
allowing for easy fitting and prediction.
"""
def __init__(
self, x_data, y_data, function='multiquadric', smooth=0.0, epsilon=None
):
"""
Initialize and fit the RBF model.
Parameters
----------
x_data : array-like of shape (n_samples, n_features)
Training data inputs.
y_data : array-like of shape (n_samples,)
Training data targets.
function : str, optional
The radial basis function to use (default: 'multiquadric').
smooth : float, optional
Smoothing parameter (default: 0.0).
epsilon : float, optional
Shape parameter (default: None).
"""
self.x_data = np.asarray(x_data)
self.y_data = np.asarray(y_data)
self.function = function
self.smooth = smooth
self.epsilon = epsilon
self.model = None
self.fit()
def fit(self):
"""Fit the RBF model using the stored data."""
self.model = Rbf(
*self.x_data.T, self.y_data,
function=self.function,
smooth=self.smooth,
epsilon=self.epsilon
)
def predict(self, x_new):
"""
Predict target values for new inputs.
Parameters
----------
x_new : array-like of shape (n_samples, n_features)
New input data.
Returns
-------
array-like
Predicted target values.
"""
x_new = np.asarray(x_new)
return self.model(*x_new.T)
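A sketch of the exact-interpolation property this wrapper relies on: with `smooth=0.0`, scipy's `Rbf` reproduces the training targets at the training points. Toy 2-D inputs standing in for `tw1`, `tw2`:

```python
import numpy as np
from scipy.interpolate import Rbf

# Toy 2-D inputs and a smooth response
x = np.array([[4.0, 4.0], [4.0, 14.0], [14.0, 4.0],
              [14.0, 14.0], [9.0, 9.0]])
y = x[:, 0] * x[:, 1]

# smooth=0.0 makes the RBF an exact interpolant at the training points;
# *x.T unpacks the coordinate columns, matching RBFModel.fit()
rbf = Rbf(*x.T, y, function="multiquadric", smooth=0.0)

# Evaluating at the training points recovers y to numerical precision
recovered = rbf(*x.T)
```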
"""
Radial Basis Function (RBF) Surrogate Model Training and Validation
==================================================================
This module handles the training of RBF surrogate models and performs
Leave-One-Out (LOO) validation to assess their predictive performance.
It automates the process for multiple output variables.
"""
import os
import argparse
import joblib
import numpy as np
import pandas as pd
from rbf_model import RBFModel
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score
def loo_validation_rbf(df_val, input_features, target_features,
function="multiquadric", smooth=0.0, epsilon=None):
"""
Leave-One-Out (LOO) validation for RBF models.
For each output:
- N RBF trainings are performed (one per sample),
- leaving out sample i, predicting point i,
- and accumulating predictions to calculate RMSE, MAE, and R².
Parameters
----------
df_val : pd.DataFrame
Complete DataFrame with tw's and outputs.
input_features : list[str]
Input column names (tw1, tw2, ...).
target_features : list[str]
Target column names (exymax_tw1, tfmmax_tw1, ...).
function, smooth, epsilon : RBF parameters.
Returns
-------
results : pd.DataFrame
Table with LOO RMSE, MAE, and R² for each output.
"""
x_all = df_val[input_features].values
n_samples = x_all.shape[0]
results = []
print("\n=== LOO validation for RBF models ===")
print(f"N samples = {n_samples}\n")
for tgt in target_features:
print(f"→ LOO for output: {tgt}")
y_all = df_val[tgt].values
y_true = []
y_pred = []
for i in range(n_samples):
# Training set: all but i
mask = np.ones(n_samples, dtype=bool)
mask[i] = False
x_train = x_all[mask]
y_train = y_all[mask]
x_test = x_all[~mask].reshape(1, -1)
# Train RBF with N-1 samples
model = RBFModel(
x_train,
y_train,
function=function,
smooth=smooth,
epsilon=epsilon
)
pred_i = model.predict(x_test)[0]
y_true.append(y_all[i])
y_pred.append(pred_i)
y_true = np.asarray(y_true)
y_pred = np.asarray(y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
mae = mean_absolute_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)
results.append({
"output": tgt,
"LOO_RMSE": rmse,
"LOO_MAE": mae,
"LOO_R2": r2
})
print(
f" LOO_RMSE = {rmse:.5f} | LOO_MAE = {mae:.5f} | "
f"LOO_R2 = {r2:.5f}"
)
results_df = pd.DataFrame(results)
return results_df
def main():
"""
Main execution pipeline for RBF surrogate training and validation.
"""
# Configuration constants
parser = argparse.ArgumentParser()
parser.add_argument("--W", type=int, required=True)
parser.add_argument("--B", type=int, required=True)
args = parser.parse_args()
w_val = args.W
b_val = args.B
h_val = None # Initialize height identifier
if w_val == 2:
h_val = 30 # Adjust height identifier for two window case
elif w_val == 3:
h_val = 45 # Adjust height identifier for three window case
elif w_val == 5:
h_val = 60 # Adjust height identifier for five window case
else:
    raise ValueError(f"Unsupported W={w_val}; expected 2, 3, or 5")
# Path configuration
data_path = f"../../data/width_optimization/{w_val}W/FEMdata_B{b_val}_H{h_val}.csv"
output_models_dir = (
f"../../models/width_optimization/models_rbf/{w_val}W/"
f"per_output_models_B{b_val}_H{h_val}"
)
os.makedirs(output_models_dir, exist_ok=True)
print("=== Loading FEM data ===")
df = pd.read_csv(data_path)
df.columns = [c.strip().lower() for c in df.columns]
input_features = [f"tw{i+1}" for i in range(w_val)]
target_features = [c for c in df.columns if c not in input_features]
x_values = df[input_features].values
print(f"Training RBF surrogate models for outputs:\n{target_features}\n")
for tgt in target_features:
print(f"→ Training RBF for {tgt} ...")
y = df[tgt].values
rbf = RBFModel(
x_values, y,
function="multiquadric",
smooth=0.0,
epsilon=None
)
joblib.dump(rbf, os.path.join(output_models_dir, f"rbf_{tgt}.joblib"))
print(f" saved to rbf_{tgt}.joblib")
# =============================================================
# Execute LOO validation
# =============================================================
loo_results = loo_validation_rbf(
df, input_features, target_features,
function="multiquadric", smooth=0.0, epsilon=None
)
loo_csv = os.path.join(output_models_dir, "rbf_LOO_results.csv")
loo_results.to_csv(loo_csv, index=False)
print(f"\nLOO results saved to: {loo_csv}")
if __name__ == "__main__":
main()
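The leave-one-out masking pattern used in `loo_validation_rbf` can be checked in isolation. A minimal sketch with hypothetical toy data and a trivial mean predictor in place of the RBF surrogate:

```python
import numpy as np

# Toy 1-D targets standing in for one FEM output column
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = len(data)

preds = []
for i in range(n):
    mask = np.ones(n, dtype=bool)
    mask[i] = False              # leave sample i out
    train = data[mask]           # the remaining N-1 samples
    preds.append(train.mean())   # trivial surrogate: mean of the rest
```

Each held-out point is predicted from a model that never saw it, which is what makes LOO a fair (if expensive, N fits per output) error estimate for small FEM datasets.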
#!/bin/bash
python ml_surrogate_train.py --W 2 --B 29
python ml_surrogate_train.py --W 2 --B 34
##python ml_surrogate_train.py --W 3 --B 29
##python ml_surrogate_train.py --W 3 --B 34
##python ml_surrogate_train.py --W 5 --B 34
python rbf_surrogate_train.py --W 2 --B 29
python rbf_surrogate_train.py --W 2 --B 34
##python rbf_surrogate_train.py --W 3 --B 29
##python rbf_surrogate_train.py --W 3 --B 34
##python rbf_surrogate_train.py --W 5 --B 34
python ml_optimization_de.py --W 2 --B 29 --TFD_W 90
python ml_optimization_de.py --W 2 --B 34 --TFD_W 90
##python ml_optimization_de.py --W 3 --B 29 --TFD_W 90
##python ml_optimization_de.py --W 3 --B 34 --TFD_W 90
##python ml_optimization_de.py --W 5 --B 34 --TFD_W 90
python rbf_optimization_de.py --W 2 --B 29 --TFD_W 90
python rbf_optimization_de.py --W 2 --B 34 --TFD_W 90
##python rbf_optimization_de.py --W 3 --B 29 --TFD_W 90
##python rbf_optimization_de.py --W 3 --B 34 --TFD_W 90
##python rbf_optimization_de.py --W 5 --B 34 --TFD_W 90