92 Commits

Author SHA1 Message Date
2e4aa313fa Update Markdown Helper to write json files for tables and yaml 2025-06-24 08:35:55 -07:00
116d5e9734 move-all-but-x-of-each (Day-Helper-2025-06-18) 2025-06-18 17:33:29 -07:00
5ca22b3792 mklink for release 2025-06-14 20:17:40 -07:00
b98d1e480b Second pass with standard get files (Day-Helper-2024-12-17)
mklink for vscode extension
2025-06-14 08:01:01 -07:00
404ffe18f5 Create equipment-automation-framework status and cell-instance-state-image-verb-if (Day-Helper-2025-06-01 and Day-Helper-2025-06-02) 2025-06-08 13:12:38 -07:00
5becc546b2 Create MatchDirectory with Parse (Day-Helper-2025-05-21) 2025-06-08 13:10:24 -07:00
0f9d004122 Updated Live Sync (Day-Helper-2025-05-19) 2025-06-08 13:09:05 -07:00
dd5baba9bc Updated Backup (Day-Helper-2024-12-17) 2025-06-08 13:06:53 -07:00
43527b3356 Selenium (Not Fully Tested)
hyper-text-markup-language-to-portable-document-format (Not Fully Tested)
2025-05-19 10:07:27 -07:00
8ca489d818 ~ over | split 2025-05-17 07:51:53 -07:00
8f22f188a2 WriteNginxFileSystem 2025-04-29 15:12:17 -07:00
d23f802cdb free-file-sync-change-created-date 2025-04-21 10:27:06 -07:00
74e9fc33af process-data-standard-format-to-json update 2025-04-19 08:15:08 -07:00
aa6461c62d Scripts 2025-04-11 20:31:29 -07:00
fad2db46b5 https sync 2025-04-10 09:16:01 -07:00
cc9c5013a9 KumaToGatus 2025-04-04 18:59:31 -07:00
23c0ff9683 Update to 2025-02-18 and 2025-02-19 2025-03-31 19:52:26 -07:00
919279a917 javascript methods for sequence to readable date
c# like java for PI5

Helper 2025-02-19 more updates for Compare
2025-03-26 17:02:35 -07:00
0621d0f07e Updates to Backup method 2025-03-23 15:57:07 -07:00
b74a6a5267 SortCodeMethods
DeleteEmptyDirectories

Remove deprecated tasks

Current Results Move to 1 or 9 ...
2025-03-22 15:37:50 -07:00
3e2fb15db0 Split Process Data Standard Format method into multiple methods 2025-03-18 08:57:43 -07:00
6783621dab #pragma warning disable CA1845, IDE0057
GetInferredCheckDirectory

Process.Start for insiders vscode
2025-03-11 11:00:47 -07:00
38ab4424bc WriteNginxFileSystemDelta
ProcessDataStandardFormatToJson
2025-03-06 14:01:00 -07:00
e89e11dcf6 PocketBaseImportWithDeno 2025-03-01 11:11:50 -07:00
384c83304b PostgresDumpToJson 2025-03-01 09:20:35 -07:00
30931eda9c Dynamic Done Column
Changes for AOT
2025-02-27 14:45:50 -07:00
55adcb69aa Removed Renci.SshNet 2025-02-24 21:25:49 -07:00
d9e394f446 CSV Compare 2025-02-24 17:46:47 -07:00
dd4a16117c Move to Archive 2025-02-19 08:43:45 -07:00
a156ff8dcb Scripts 2025-02-15 10:03:30 -07:00
b771761936 Add option directory
Allow lower case drive letters
2025-02-11 14:23:11 -07:00
264b6319cb Linux AOT and Container 2025-02-09 15:21:13 -07:00
7aada4303e Added RemainingWork and StoryPoints 2025-02-08 09:57:54 -07:00
618fa0d55f ExtractKanban 2025-02-05 16:19:13 -07:00
930963965d Changed trigger for code-insiders 2025-01-29 20:13:41 -07:00
9d612d3d3d Move Bank PDF files to Year Month 2025-01-26 16:02:14 -07:00
3070fee04c Removed GetTaskArgumentsForDayHelper20240623 2025-01-15 08:56:36 -07:00
9f4286e3e9 Directory Renaming 2025-01-14 13:00:52 -07:00
5c08ac222a Format from docker 2025-01-08 16:05:21 +00:00
4b85f8807d Year-Season-Attachments
Updated mklink for new version and directory creation
Epoch Testing
2025-01-06 13:46:00 -07:00
7f1f149eac MoveToDelete 2025-01-01 19:44:01 -07:00
fe524b7536 DeleteFirst testing 2024-12-30 13:49:53 -07:00
fb9289a572 Update Subtasks In Markdown Files
Better ISO support

Only reviewing Files when comparing

Extracted sections from UpdateSubTasksInMarkdownFiles
2024-12-26 14:14:31 -07:00
2361796bbf Compile Warnings 2024-12-14 09:41:18 -07:00
9afc7360b9 Not tested 2024-12-13 22:18:34 -07:00
02c03f2853 Rename by replace 2024-12-12 12:20:56 -07:00
c37771da61 Removed AppSettings layer 2024-12-11 10:14:45 -07:00
09c37aed14 DebugProxyPass dynamic server_name and Trim ; 2024-12-04 16:49:39 -07:00
2f65dd3120 DebugProxyPass dynamic server_name 2024-12-04 16:35:35 -07:00
cf531cff36 ConvertToUTF8 2024-12-04 13:18:46 -07:00
ec98d1e275 Update to Nginx parse for default 2024-12-04 10:13:32 -07:00
ecc1cf3a62 Log Only 2024-11-22 17:51:40 -07:00
446b5587be Not Tested 2024-11-22 17:09:25 -07:00
3812a46667 Rename 2024-11-09 13:38:51 -07:00
dc9327274b Still Testing 2024-11-08 17:59:42 -07:00
68c2a34096 Test ticks 2024-11-06 11:17:09 -07:00
f0ebc5b574 HgCV
CDE
Use of LinkTarget
2024-11-01 17:08:04 -07:00
e26c4ccf31 Add Person 2024-10-20 20:12:35 -07:00
80ca8f98eb Exif Helper 2024-10-20 18:56:57 -07:00
19326df4c6 private record 2024-10-19 09:32:22 -07:00
0ee1846c72 Created Tests 2024-10-13 10:41:24 -07:00
b6d8d4c52f Moved to ADO2024 PI#
Ran SortCodeMethods
2024-10-11 09:15:32 -07:00
3d114918e4 FeatureCheckTag 2024-10-04 18:10:30 -07:00
ac5f1caa23 Fixed AOT Warnings 2024-10-03 12:35:31 -07:00
b8ebd7cfe1 Convert InfinityQS Project Files 2024-10-03 11:18:41 -07:00
03ada95fbb InfinityQS 2024-09-25 14:03:27 -07:00
59231ad9e2 Sync to Distinct Max Iteration Path 2024-09-24 10:25:07 -07:00
dc5a369e55 Debug Proxy Pass 2024-09-16 18:50:47 -07:00
73baddb820 Trim 2024-09-16 16:56:13 -07:00
8db9514c83 CommonMark.NET 2024-09-16 14:41:33 -07:00
6bda42fe67 Sort 2024-09-13 14:41:34 -07:00
ba9b7d8d64 ADO Markdown 2024-09-11 20:49:14 -07:00
5d679ae04c ADO System Parent 2024-09-11 12:24:45 -07:00
d2cc0c0e0b Year C 2024-09-10 18:25:11 -07:00
5eac175e4b Readonly 2024-09-10 17:57:10 -07:00
db24568cb4 MoveFiles to Week of Year 2024-09-10 17:07:46 -07:00
fdb1e32c82 WorkItem more more 2024-09-06 19:58:43 -07:00
9c6740becb WorkItem more 2024-09-06 16:23:27 -07:00
c7bcb4a5ea UpdateIteration 2024-09-04 12:28:01 -07:00
ee7841d9c3 EPP * to . 2024-09-03 10:02:10 -07:00
2d5a61e78f Download only 2024-09-03 09:41:44 -07:00
5d4af32c4d Directory Date 2024-08-29 11:19:57 -07:00
ad3798f246 Second round of MoveWaferCounterToArchive 2024-08-29 09:05:50 -07:00
23eacb54c1 MoveWaferCounterToArchive
ParseKanbn
2024-08-29 08:03:46 -07:00
f87c5a9aa6 ADO Comment update 2024-08-21 13:13:12 -07:00
d8bf4b3ab6 Sort order 2024-08-20 22:38:21 -07:00
b525d29f9c Ready to test MoveFilesWithSleep 2024-08-20 21:00:16 -07:00
fd1ee79e75 Reactor 2024-08-20 20:46:57 -07:00
095618b194 CreatedDate 2024-08-16 11:13:14 -07:00
754cc1ee2b Ready to start loading backlog 2024-08-14 13:51:57 -07:00
29bec0cb9a Update Namespaces 2024-08-09 09:54:22 -07:00
a0699ef634 TryArchiveFiles 2024-08-07 13:29:47 -07:00
166 changed files with 21794 additions and 1359 deletions

1
.gitignore vendored

@ -336,3 +336,4 @@ ASALocalRun/
.extensions-vscode-oss
.extensions-vscode-insiders
.vscode/.UserSecrets/secrets.json
.vscode/.helper

422
.vscode/.json vendored

@ -1,312 +1,154 @@
[
{
"id": "403675d4-631e-40bb-900e-fae36d9c9cdd",
"deviceAssetId": "449501900719.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/71/449501900719.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/40/36/403675d4-631e-40bb-900e-fae36d9c9cdd-preview.jpeg",
"fileCreatedAt": "2016-12-02T02:34:23-07:00",
"fileModifiedAt": "2016-12-02T02:34:22-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/40/36/403675d4-631e-40bb-900e-fae36d9c9cdd-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x28b46dbf4864b92f18800815cf8145c38d037e92",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:13:20.074314-07:00",
"createdAt": "2024-04-25T10:14:24.253144-07:00",
"isArchived": false,
"originalFileName": "449501900719.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\xe5a9090d8257787870788886886877776870760aa9",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2016-12-01T19:34:23-07:00",
"stackId": null
"EndLine": 116,
"FirstLine": "private Tuple\u003Cstring, Test[], JsonElement[], List\u003CFileInfo\u003E\u003E GetExtractResult(string reportFullPath, DateTime dateTime)",
"FirstUsedLine": 87,
"Name": "GetExtractResult",
"ParameterCount": 2,
"StartLine": 107
},
{
"id": "11ceb05f-8c94-46cd-9a7e-1c06be5a18b8",
"deviceAssetId": "015516300831.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/83/015516300831.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/11/ce/11ceb05f-8c94-46cd-9a7e-1c06be5a18b8-preview.jpeg",
"fileCreatedAt": "2014-05-03T14:44:20-07:00",
"fileModifiedAt": "2014-11-17T11:18:58-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/11/ce/11ceb05f-8c94-46cd-9a7e-1c06be5a18b8-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x5b976715bab319b3bdc69d5f337701a062494e0b",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:55.048725-07:00",
"createdAt": "2024-04-25T10:14:12.923101-07:00",
"isArchived": false,
"originalFileName": "015516300831.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x5a08120c00771777f87778979877597fbef365",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2014-05-03T14:44:20-07:00",
"stackId": null
"EndLine": 478,
"FirstLine": "private void WriteFiles(string reportFullPath, DateTime dateTime)",
"FirstUsedLine": 112,
"Name": "WriteFiles",
"ParameterCount": 2,
"StartLine": 467
},
{
"id": "e8e94a75-2b0c-48f6-b26a-76f5cbe46233",
"deviceAssetId": "985177500821.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/985177500821.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/e8/e9/e8e94a75-2b0c-48f6-b26a-76f5cbe46233-preview.jpeg",
"fileCreatedAt": "2004-04-28T20:31:40-07:00",
"fileModifiedAt": "2018-05-16T21:41:26.093-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/e8/e9/e8e94a75-2b0c-48f6-b26a-76f5cbe46233-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x176b222fa88bc72aaf81031f3b7f73644b178de4",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:55.667409-07:00",
"createdAt": "2024-04-25T10:14:12.945414-07:00",
"isArchived": false,
"originalFileName": "985177500821.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\xdf07121d0687868f87378788887877887780670789",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2004-04-28T20:31:40-07:00",
"stackId": null
"EndLine": 195,
"FirstLine": "private static ReadOnlyCollection\u003CRecord\u003E GetKeyValuePairs(ReadOnlyDictionary\u003Cint, WorkItem\u003E keyValuePairs, WorkItem workItem, List\u003Cbool\u003E nests)",
"FirstUsedLine": 132,
"Name": "GetKeyValuePairs",
"ParameterCount": 3,
"StartLine": 159
},
{
"id": "4091bebd-4c26-4d30-bd3a-f2160a54b451",
"deviceAssetId": "956694610829.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/956694610829.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/40/91/4091bebd-4c26-4d30-bd3a-f2160a54b451-preview.jpeg",
"fileCreatedAt": "2010-07-05T09:10:13.2-07:00",
"fileModifiedAt": "2010-07-05T08:10:12-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/40/91/4091bebd-4c26-4d30-bd3a-f2160a54b451-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\xc2eb5667d6da5ead1be71c51064ea293ad413ea6",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:56.364375-07:00",
"createdAt": "2024-04-25T10:14:12.976197-07:00",
"isArchived": false,
"originalFileName": "956694610829.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x12080a0d82668886808887867877877867807906b7",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2010-07-05T09:10:13.2-07:00",
"stackId": null
"EndLine": 158,
"FirstLine": "private static int? GetIdFromUrlIfChild(Relation relation)",
"FirstUsedLine": 173,
"Name": "GetIdFromUrlIfChild",
"ParameterCount": 1,
"StartLine": 143
},
{
"id": "c7bf1944-9f71-4808-8ff9-b0f972e907b0",
"deviceAssetId": "948800300821.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/948800300821.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/c7/bf/c7bf1944-9f71-4808-8ff9-b0f972e907b0-preview.jpeg",
"fileCreatedAt": "2009-10-09T05:35:00.2-07:00",
"fileModifiedAt": "2009-10-09T04:35:00-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/c7/bf/c7bf1944-9f71-4808-8ff9-b0f972e907b0-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x3c5c87ab7e442d1f7a0f2a12678c1d6be00dbc7b",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:56.553262-07:00",
"createdAt": "2024-04-25T10:14:12.982686-07:00",
"isArchived": false,
"originalFileName": "948800300821.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x103806258e02bd47937779a478997768fd3bcb9fa4",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2009-10-09T05:35:00.2-07:00",
"stackId": null
"EndLine": 271,
"FirstLine": "private static void AppendLines(List\u003Cchar\u003E spaces, List\u003Cstring\u003E lines, Record record, bool condensed, bool sprintOnly)",
"FirstUsedLine": 199,
"Name": "AppendLines",
"ParameterCount": 5,
"StartLine": 257
},
{
"id": "4f5ea703-47e9-48c6-9366-0cc10630dac2",
"deviceAssetId": "898525300821.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/898525300821.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/4f/5e/4f5ea703-47e9-48c6-9366-0cc10630dac2-preview.jpeg",
"fileCreatedAt": "2020-12-25T08:35:04.92-07:00",
"fileModifiedAt": "2020-12-25T08:35:04-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/4f/5e/4f5ea703-47e9-48c6-9366-0cc10630dac2-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\xf706263e450c9a26feaeba2dd14fe0fd8f22e623",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:58.530848-07:00",
"createdAt": "2024-04-25T10:14:13.048275-07:00",
"isArchived": false,
"originalFileName": "898525300821.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x5518060d8208976849959a99687678687f8dae48f6",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2020-12-25T01:35:04.92-07:00",
"stackId": null
"EndLine": 246,
"FirstLine": "private static void AppendLines(string url, List\u003Cchar\u003E spaces, List\u003Cstring\u003E lines, ReadOnlyCollection\u003CRecord\u003E records, string workItemType)",
"FirstUsedLine": 217,
"Name": "AppendLines",
"ParameterCount": 5,
"StartLine": 199
},
{
"id": "86c813ad-2a1c-489f-8fc2-0b76a21889c0",
"deviceAssetId": "864710800829.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/864710800829.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/86/c8/86c813ad-2a1c-489f-8fc2-0b76a21889c0-preview.jpeg",
"fileCreatedAt": "2004-04-28T20:00:46-07:00",
"fileModifiedAt": "2004-04-28T19:00:46-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/86/c8/86c813ad-2a1c-489f-8fc2-0b76a21889c0-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x9061edbf75f11526cef2c832ba339267509eaec4",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:59.171233-07:00",
"createdAt": "2024-04-25T10:14:13.078169-07:00",
"isArchived": false,
"originalFileName": "864710800829.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x140812250674874f87777669788778887a93a0470a",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2004-04-28T20:00:46-07:00",
"stackId": null
"EndLine": 198,
"FirstLine": "private static string GetClosed(WorkItem workItem) =\u003E",
"FirstUsedLine": 250,
"Name": "GetClosed",
"ParameterCount": 1,
"StartLine": 196
},
{
"id": "b65121d8-4a74-4f27-9d6f-c582ffc444dc",
"deviceAssetId": "862274900829.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/862274900829.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/b6/51/b65121d8-4a74-4f27-9d6f-c582ffc444dc-preview.jpeg",
"fileCreatedAt": "2018-08-17T22:50:55.15-07:00",
"fileModifiedAt": "2022-11-03T20:25:09.161-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/b6/51/b65121d8-4a74-4f27-9d6f-c582ffc444dc-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\xd4f623e97acd727868fe0e191c170e449d4456a5",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:07:59.640678-07:00",
"createdAt": "2024-04-25T10:14:13.087927-07:00",
"isArchived": false,
"originalFileName": "862274900829.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x99180a0d045977a077687887777678876a806b0867",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2018-08-17T15:50:55.15-07:00",
"stackId": null
"EndLine": 256,
"FirstLine": "private static string GetLine(List\u003Cchar\u003E spaces, WorkItem workItem, Record record, bool condensed, bool sprintOnly)",
"FirstUsedLine": 265,
"Name": "GetLine",
"ParameterCount": 5,
"StartLine": 247
},
{
"id": "09fa281c-b828-47f6-8fbb-a5856edb63b5",
"deviceAssetId": "840656100829.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/840656100829.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/09/fa/09fa281c-b828-47f6-8fbb-a5856edb63b5-preview.jpeg",
"fileCreatedAt": "2019-05-30T14:56:36.82-07:00",
"fileModifiedAt": "2019-05-30T14:56:36-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/09/fa/09fa281c-b828-47f6-8fbb-a5856edb63b5-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\xd215606441cefcc295130262bad9fed96d9ac40e",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:08:00.211274-07:00",
"createdAt": "2024-04-25T10:14:13.104556-07:00",
"isArchived": false,
"originalFileName": "840656100829.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\xd6070a0d826f62873c788799993a7777137f679058",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2019-05-30T07:56:36.82-07:00",
"stackId": null
"EndLine": 435,
"FirstLine": "private static void WriteFiles(string destinationDirectory, ReadOnlyCollection\u003Cstring\u003E lines, ReadOnlyCollection\u003CWorkItem\u003E workItems, string fileName)",
"FirstUsedLine": 272,
"Name": "WriteFiles",
"ParameterCount": 4,
"StartLine": 417
},
{
"id": "8c239624-2bea-479d-b7fa-9f2cd5ebc9b7",
"deviceAssetId": "812813100821.jpg",
"ownerId": "fc9fd5a1-d1b3-4080-a21c-daf9b1c24593",
"deviceId": "Library Import",
"type": "IMAGE",
"originalPath": "/var/snap/immich-distribution/pictures/82/812813100821.jpg",
"previewPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/8c/23/8c239624-2bea-479d-b7fa-9f2cd5ebc9b7-preview.jpeg",
"fileCreatedAt": "2021-08-28T16:29:07.65-07:00",
"fileModifiedAt": "2021-08-28T16:29:08-07:00",
"isFavorite": false,
"duration": null,
"thumbnailPath": "/var/snap/immich-distribution/common/upload/thumbs/fc9fd5a1-d1b3-4080-a21c-daf9b1c24593/8c/23/8c239624-2bea-479d-b7fa-9f2cd5ebc9b7-thumbnail.webp",
"encodedVideoPath": "",
"checksum": "\\x25a9ffe84298f0e3e7151aaf2eb339908574c035",
"isVisible": true,
"livePhotoVideoId": null,
"updatedAt": "2024-04-25T13:08:00.918411-07:00",
"createdAt": "2024-04-25T10:14:13.134038-07:00",
"isArchived": false,
"originalFileName": "812813100821.jpg",
"sidecarPath": null,
"isReadOnly": true,
"thumbhash": "\\x21080e0d825878767f9678bf7747799612c3b0308a",
"isOffline": false,
"libraryId": "af11ab7c-0782-4b7a-ba8e-fe68cf9a718d",
"isExternal": true,
"deletedAt": null,
"localDateTime": "2021-08-28T09:29:07.65-07:00",
"stackId": null
"EndLine": 445,
"FirstLine": "private static ReadOnlyDictionary\u003Cint, Record\u003E GetWorkItems(ReadOnlyCollection\u003CWorkItem\u003E workItems)",
"FirstUsedLine": 277,
"Name": "GetWorkItems",
"ParameterCount": 1,
"StartLine": 436
},
{
"EndLine": 457,
"FirstLine": "private static void WriteFileStructure(string destinationDirectory, ReadOnlyDictionary\u003Cint, Record\u003E keyValuePairs)",
"FirstUsedLine": 278,
"Name": "WriteFileStructure",
"ParameterCount": 2,
"StartLine": 446
},
{
"EndLine": 466,
"FirstLine": "private static void WriteFiles(string destinationDirectory, ReadOnlyCollection\u003CRecord\u003E records, string fileName)",
"FirstUsedLine": 284,
"Name": "WriteFiles",
"ParameterCount": 3,
"StartLine": 458
},
{
"EndLine": 335,
"FirstLine": "private static void WriteFiles(FileConnectorConfiguration fileConnectorConfiguration, string url, ReadOnlyCollection\u003Cstring\u003E workItemTypes, ReadOnlyCollection\u003CWorkItem\u003E workItems)",
"FirstUsedLine": 292,
"Name": "WriteFiles",
"ParameterCount": 4,
"StartLine": 272
},
{
"EndLine": 346,
"FirstLine": "private static void FilterChildren(ReadOnlyCollection\u003Cstring\u003E workItemTypes, Record record, List\u003CWorkItem\u003E results)",
"FirstUsedLine": 343,
"Name": "FilterChildren",
"ParameterCount": 3,
"StartLine": 336
},
{
"EndLine": 389,
"FirstLine": "private static ReadOnlyCollection\u003Cstring\u003E GetChildrenDirectories(ReadOnlyDictionary\u003Cint, Record\u003E keyValuePairs, List\u003Cbool\u003E nests, string parentDirectory, Record record)",
"FirstUsedLine": 384,
"Name": "GetChildrenDirectories",
"ParameterCount": 4,
"StartLine": 365
},
{
"EndLine": 142,
"FirstLine": "private static ReadOnlyDictionary\u003Cint, Record\u003E GetKeyValuePairs(ReadOnlyDictionary\u003Cint, WorkItem\u003E keyValuePairs)",
"FirstUsedLine": 442,
"Name": "GetKeyValuePairs",
"ParameterCount": 1,
"StartLine": 117
},
{
"EndLine": 416,
"FirstLine": "private static ReadOnlyCollection\u003Cstring\u003E GetDirectories(string destinationDirectory, ReadOnlyDictionary\u003Cint, Record\u003E keyValuePairs)",
"FirstUsedLine": 448,
"Name": "GetDirectories",
"ParameterCount": 2,
"StartLine": 390
},
{
"EndLine": 353,
"FirstLine": "private static ReadOnlyCollection\u003CWorkItem\u003E FilterChildren(ReadOnlyCollection\u003Cstring\u003E workItemTypes, Record record)",
"FirstUsedLine": 506,
"Name": "FilterChildren",
"ParameterCount": 2,
"StartLine": 347
},
{
"EndLine": 364,
"FirstLine": "private static int GetState(WorkItem workItem) =\u003E",
"FirstUsedLine": 628,
"Name": "GetState",
"ParameterCount": 1,
"StartLine": 354
}
]

27
.vscode/download-work-items.http vendored Normal file

@ -0,0 +1,27 @@
@host = https://tfs.intra.infineon.com
@pat = asdf
@ids = 126018, 224543
GET {{host}}/tfs/FactoryIntegration/_apis/wit/workitems?ids={{ids}}&$expand=Relations
Accept: application/json
Authorization: Basic {{pat}}
###
GET {{host}}/tfs/FactoryIntegration/_apis/wit/workitems/{{ids}}/updates
Accept: application/json
Authorization: Basic {{pat}}
### Iterations
GET {{host}}/tfs/FactoryIntegration/ART%20SPS/cea9f426-6fb1-4d65-93d5-dbf471056212/_apis/work/teamsettings/iterations?
Accept: application/json
Authorization: Basic {{pat}}
###
DELETE http://localhost:5004/api/SyncV1/?size=4&ticks=638796666663591762&path=d:\Tmp\phares\VisualStudioCode\z-include-patterns - Copy.nsv
###
GET http://localhost:5004/api/SyncV1/?size=4&ticks=638796666663591762&path=d:\Tmp\phares\VisualStudioCode\z-include-patterns - Copy.nsv
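
For context, a minimal C# sketch of issuing the first work-items request above with HttpClient. It is hypothetical and not part of this change set; the host, pat, and ids values simply mirror the placeholders in the .http file, and the Basic-auth encoding follows the usual Azure DevOps convention of an empty user name with the PAT as the password.

```csharp
// Hypothetical equivalent of the first request in download-work-items.http.
// host, pat, and ids mirror the placeholders above; nothing here is repository code.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

const string host = "https://tfs.intra.infineon.com";
const string pat = "asdf";              // personal access token placeholder
const string ids = "126018,224543";

using HttpClient client = new();
// The .http file sends {{pat}} directly; Azure DevOps conventionally expects Base64(":" + pat).
string token = Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}"));
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

string url = $"{host}/tfs/FactoryIntegration/_apis/wit/workitems?ids={ids}&$expand=Relations";
string json = await client.GetStringAsync(url);
Console.WriteLine(json);
```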

View File

@ -0,0 +1,4 @@
@host = eaf-prod.mes.infineon.com:9003
POST {{host}}/StatusQuery
Accept: application/json

136
.vscode/launch.json vendored

@ -11,18 +11,108 @@
"preLaunchTask": "build",
"program": "${workspaceFolder}/bin/Debug/net8.0/win-x64/File-Folder-Helper.dll",
"args": [
"s",
"M",
"D:/5-Other-Small/Notes/EC-Documentation",
"-d",
"D:/5-Other-Small/Notes/EC-Documentation/.vscode/helper",
"s",
"X",
"D:/5-Other-Small/Free-File-Sync",
"Day-Helper-2024-08-05",
"*.ffs_gui",
"lines.md",
"C:/Users/phares/AppData/Roaming/FreeFileSync/GlobalSettings.xml",
"5555",
"6666",
"7777",
"8888",
"9999"
"D:/5-Other-Small/Proxmox/DiskInfo",
"Day-Helper-2025-06-18",
"*.json",
"D:/5-Other-Small/Proxmox/Disk-Info-Old",
"-2025-",
"1",
"s",
"X",
"D:/Tmp",
"Day-Helper-2025-06-02",
"infineon\\MESPhares",
"BACKLOG~BIORAD2~BIORAD3~BIORAD4~BIORAD5~CDE4~CDE5~CDE6~DEP08CEPIEPSILON~DEP08SIASM~DEP08SIHTRPLC~EC~HGCV1~HGCV2~HGCV3~MESAFIBACKLOG~MET06AWCT~MET08ANLYSDIFAAST230~MET08AWCT~MET08DDUPSFS6420~MET08DDUPSP1TBI~MET08RESIHGCV~MET08RESIMAPCDE~MET08RESISRP2100~MET08THFTIRQS408M~MET08THFTIRSTRATUS~METCLIMATEC~R29~R32~R36~R47~R55~R57~R61~R62~R65~R70~R72~R73~R74~R75~R77~SP101~SPV01~SRP~TENCOR1~TENCOR2~TENCOR3~TRENDLOG~WC6INCH1~WC6INCH2~WC6INCH3~WC6INCH4~WC8INCH1~WC8INCH2~WC8INCH3",
"s",
"X",
"D:/5-Other-Small/Proxmox/ffnm",
"Day-Helper-2025-05-21",
"*.pdf",
"*.md",
"2",
"MM-dd-yy",
"Trans Date~Effective Date~Description~Withdrawal Deposit~Balance",
"s",
"X",
"D:/Tmp/phares/VisualStudioCode",
"Day-Helper-2025-05-19",
"D:/Tmp/phares/VisualStudioCode/.vscode/input.json",
"s",
"X",
"D:/Tmp/phares/VisualStudioCode",
"Day-Helper-2025-05-19",
"D:/Tmp/phares/VisualStudioCodeLeft",
"z-include-patterns.nsv",
"z-exclude-patterns.nsv",
"http://localhost:5004",
"/api/SyncV1/?",
",L",
".G",
"+~G~~L~+~Custom-Default",
"",
"+~G~~G~-~Mirror",
"+~G~~~~Update",
"+~G~~L~+~Custom-Default",
"-~G~~G~+~Custom-A",
"-~L~~L~+~Custom-B",
"+~L~~L~-~Custom-C",
"s",
"X",
"\\\\mesfs.infineon.com\\EC_Characterization_Si\\Archive\\BIORAD4\\2025_Week_16\\2025-04-17",
"Day-Helper-2025-02-19",
"csv-*.pdsf",
"*.pdsf",
"Time,HeaderUniqueId,UniqueId,Date,Wafer,Position,BIORAD4",
",BIORAD4",
",BIORAD4",
"Test|EventId,Date|DateTime,Position|Slot,DeltaThicknessSlotsOneAndTwentyFive|Actual Delta Thick Pts 1 and 25,PercentDeltaThicknessSlotsOneAndTwentyFive|% Delta Thick Pts 1 and 25,MID|Cassette,Lot|Batch,Title|Batch,Wafer|Text,Thickness|Site,MeanThickness|GradeMean,|BIORAD4",
"Time,A_LOGISTICS,B_LOGISTICS,Test,Count,Index,MesEntity,MID,Date,Employee,Lot,PSN,Reactor,Recipe,Cassette,GradeStdDev,HeaderUniqueId,Layer,MeanThickness,PassFail,RDS,Slot,Title,UniqueId,Wafer,Zone,Mean,Position,StdDev,Thickness,ThicknessSlotOne,ThicknessSlotTwentyFive,DeltaThicknessSlotsOneAndTwentyFive,PercentDeltaThicknessSlotsOneAndTwentyFive",
"Time,A_LOGISTICS,B_LOGISTICS,Count,Sequence,MesEntity,Index,Batch,Cassette,DateTime,Destination,Mean,PassFail,Recipe,Reference,Site,Slot,Source,StdDev,Text,GradeMean,GradeStdDev,RDS,PSN,Reactor,Layer,Zone,Employee,InferredLot,Thickness First Slot,Thickness Last Slot,Actual Delta Thick Pts 1 and 25,% Delta Thick Pts 1 and 25,EventId",
"0,1,2,31,3,6,5,8,9,27,7,23,24,13,8,21,-1,25,20,12,22,16,7,-1,19,26,11,16,18,15,-1,-1,29,30",
"s",
"X",
"C:/Users/phares/AppData/Roaming/FreeFileSync",
"Day-Helper-2025-04-21",
"GlobalSettings.xml",
"LastSync|Config",
"s",
"X",
"L:/Tmp/MET08ANLYSDIFAAST230",
"Day-Helper-2025-03-06",
"*.pdsf",
"s",
"X",
"D:/ProgramData/VisualStudioCode|D:/6-Other-Large-Z/Linux-Ubuntu-Phares/home/lphares/dorico",
"Day-Helper-2025-04-07",
"z-include-patterns.nsv",
"z-exclude-patterns.nsv",
"https://isccvm57294f1ed/VisualStudioCode|hxttps://dorico.phares.duckdns.org|hxttps://mestsa006.infineon.com/VisualStudioCode",
"+|G|G|G|-",
"||||",
"666",
"777",
"888",
"999",
"s",
"X",
"C:/Users/PHARES/AppData/Local/IFXApps/gatus",
"Day-Helper-2025-04-04",
"*.json",
".metrics",
"https://messa010ec.infineon.com/metrics",
"gatus_results_endpoint_success",
"666",
"777",
"888",
"999",
""
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
@ -32,6 +122,32 @@
"name": ".NET Core Attach",
"type": "coreclr",
"request": "attach"
},
{
"type": "node",
"request": "launch",
"name": "node Launch Current Opened File",
"program": "${file}"
},
{
"type": "bun",
"internalConsoleOptions": "neverOpen",
"request": "launch",
"name": "Debug File",
"program": "${file}",
"cwd": "${workspaceFolder}",
"stopOnEntry": false,
"watchMode": false
},
{
"type": "bun",
"internalConsoleOptions": "neverOpen",
"request": "launch",
"name": "Run File",
"program": "${file}",
"cwd": "${workspaceFolder}",
"noDebug": true,
"watchMode": false
}
]
}

25
.vscode/mklink.md vendored

@ -14,15 +14,28 @@ mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.kanbn" "D:\5-Other-Small\Kanban
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.kanbn" "D:\5-Other-Small\Kanban\File-Folder-Helper"
```
```bash
```bash Thu Jul 18 2024 13:47:40 GMT-0700 (Mountain Standard Time)
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.vscode\.UserSecrets" "C:\Users\phares\AppData\Roaming\Microsoft\UserSecrets\8da397d4-13ec-4576-9722-3c79cad25563"
```
```bash 1749414316830 = 638850111168300000 = 2025-2.Spring = Sun Jun 08 2025 13:25:16 GMT-0700 (Mountain Standard Time)
C:\Users\PHARES\.vscode\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0
del "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode"
del "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode-oss"
del "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode-insiders"
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode" "C:\Users\phares\.vscode\extensions\ifx.type-script-helper-1.6.3"
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode-oss" "C:\Users\phares\.vscode-oss\extensions\ifx.type-script-helper-1.6.3"
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.extensions-vscode-insiders" "C:\Users\phares\.vscode-insiders\extensions\ifx.type-script-helper-1.6.3"
mkdir "C:\Users\phares\.vscode\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64"
mkdir "C:\Users\phares\.vscode-oss\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64"
mkdir "C:\Users\phares\.vscode-insiders\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64"
mklink /J "C:\Users\phares\.vscode\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64\publish" "L:\DevOps\Mesa_FI\File-Folder-Helper\bin\Release\net8.0\win-x64\publish"
mklink /J "C:\Users\phares\.vscode-oss\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64\publish" "L:\DevOps\Mesa_FI\File-Folder-Helper\bin\Release\net8.0\win-x64\publish"
mklink /J "C:\Users\phares\.vscode-insiders\extensions\infineon-technologies-ag-mesa-fi.infineon-technologies-ag-mesa-fi-cost-of-delay-helper-1.124.0\net8.0\win-x64\publish" "L:\DevOps\Mesa_FI\File-Folder-Helper\bin\Release\net8.0\win-x64\publish"
```
```bash Thu Jul 18 2024 13:47:40 GMT-0700 (Mountain Standard Time)
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.vscode\.UserSecrets" "L:\Git\Notes-User-Secrets\.UserSecrets\8da397d4-13ec-4576-9722-3c79cad25563"
```bash 1749957317559 = 638855541175590000 = 2025-2.Spring = Sat Jun 14 2025 20:15:17 GMT-0700 (Mountain Standard Time)
mkdir "L:\DevOps\MESA_FI\file-folder-helper\bin\Release\net8.0\win-x64"
mklink /J "L:\DevOps\MESA_FI\file-folder-helper\bin\Release\net8.0\win-x64\publish" "D:\5-Other-Small\Proxmox\publish"
```
```bash 1750459968132 = 638860567681320000 = 2025-3.Summer = Fri Jun 20 2025 15:52:47 GMT-0700 (Mountain Standard Time)
mklink /J "L:\DevOps\Mesa_FI\File-Folder-Helper\.vscode\.helper" "D:\5-Other-Small\Notes\Infineon\.vscode\helper"
```

21
.vscode/settings.json vendored

@ -10,14 +10,20 @@
"**/node_modules": true
},
"cSpell.words": [
"abcdefghiklmnopqrstuvwxyz",
"Acks",
"ASPNETCORE",
"BIORAD",
"BIRT",
"CHIL",
"DEAT",
"endianness",
"Exif",
"FAMC",
"FAMS",
"Gatus",
"GIVN",
"HGCV",
"HUSB",
"Immich",
"INDI",
@ -25,17 +31,30 @@
"Kanban",
"kanbn",
"Kofax",
"Linc",
"mesfs",
"mestsa",
"netrm",
"NpgSql",
"NSFX",
"OBJE",
"onenote",
"PDFC",
"pdsf",
"Permyriad",
"pged",
"Phares",
"Renci",
"Reparse",
"Rijndael",
"Serilog",
"startable",
"SUBM",
"SURN",
"SYSLIB"
"SYSLIB",
"TENCOR",
"VSTS",
"WIQL",
"WSJF"
]
}

158
.vscode/tasks.json vendored

@ -58,12 +58,95 @@
"type": "process",
"args": [
"build",
"-r",
"win-x64",
"${workspaceFolder}/File-Folder-Helper.csproj",
"/property:GenerateFullPaths=true",
"/consoleloggerparameters:NoSummary"
],
"problemMatcher": "$msCompile"
},
{
"label": "build Linux",
"command": "dotnet",
"type": "process",
"args": [
"build",
"-r",
"linux-x64",
"${workspaceFolder}/File-Folder-Helper.csproj",
"/property:GenerateFullPaths=true",
"/consoleloggerparameters:NoSummary"
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanLogin",
"command": "podman",
"type": "process",
"args": [
"login",
"gitea.phares.duckdns.org:443"
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanBuild",
"command": "podman",
"type": "process",
"args": [
"build",
"-t",
"file-folder-helper",
"."
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanImageList",
"command": "podman",
"type": "process",
"args": [
"image",
"ls"
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanRun",
"command": "podman",
"type": "process",
"args": [
"run",
"-p",
"5001:5001",
"--name",
"file-folder-helper-001",
"a3de856b5731"
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanTag",
"command": "podman",
"type": "process",
"args": [
"tag",
"a3de856b5731",
"gitea.phares.duckdns.org:443/phares3757/file-folder-helper:latest"
],
"problemMatcher": "$msCompile"
},
{
"label": "podmanPush",
"command": "podman",
"type": "process",
"args": [
"push",
"gitea.phares.duckdns.org:443/phares3757/file-folder-helper:latest"
],
"problemMatcher": "$msCompile"
},
{
"label": "publish",
"command": "dotnet",
@ -106,50 +189,21 @@
"problemMatcher": "$msCompile"
},
{
"label": "File-Folder-Helper AOT s J Verdaccio",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"label": "Publish AOT Linux",
"command": "dotnet",
"type": "process",
"args": [
"s",
"J",
"L:/Verdaccio/storage",
"publish",
"-r",
"linux-x64",
"-c",
"Release",
"-p:PublishAot=true",
"${workspaceFolder}/File-Folder-Helper.csproj",
"/property:GenerateFullPaths=true",
"/consoleloggerparameters:NoSummary"
],
"problemMatcher": []
},
{
"label": "File-Folder-Helper AOT s S BaGet",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"args": [
"s",
"S",
"L:/BaGet/packages",
],
"problemMatcher": []
},
{
"label": "File-Folder-Helper AOT s X SortCodeMethods",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"args": [
"s",
"X",
"L:/DevOps/Mesa_FI/File-Folder-Helper",
"Day-Helper-2024-01-08",
"L:/DevOps/Mesa_FI/File-Folder-Helper/Helpers"
],
"problemMatcher": []
},
{
"label": "File-Folder-Helper AOT s F Staging _Logs",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"args": [
"s",
"F",
"'\\\\messv02ecc1.ec.local\\EC_EAFLog\\Staging\\_ Logs'",
],
"problemMatcher": []
"problemMatcher": "$msCompile"
},
{
"label": "Kanbn Console",
@ -168,6 +222,26 @@
"type": "npm",
"script": "kanbn.board.json",
"problemMatcher": []
},
{
"label": "Jest",
"type": "shell",
"command": "npx jest",
"problemMatcher": []
},
{
"label": "File-Folder-Helper AOT s X Day-Helper-2025-03-20",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"args": [
"s",
"X",
"L:/DevOps/Mesa_FI/File-Folder-Helper",
"Day-Helper-2025-03-20",
"false",
"4"
],
"problemMatcher": []
}
]
}

View File

@ -6,7 +6,7 @@ using System.Xml;
using System.Xml.Linq;
using System.Xml.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
#pragma warning disable IDE1006, CS8618
@ -1089,7 +1089,7 @@ internal static partial class Helper20240105
port = new System.Uri(uri).Port;
}
List<Field> itemFields = (from l in keePassFileGroupEntryStrings select new Field(l.Key, l.Value.Value, 0)).ToList();
Login login = new(new Uri[] { new(uri, host, port) }, username, password);
Login login = new([new(uri, host, port)], username, password);
result = new(revisionDate,
creationTime,
folderId,
@ -1282,7 +1282,7 @@ internal static partial class Helper20240105
revisionDate = item.RevisionDate;
notes.Add($"{item.Login.Password} on {item.RevisionDate}");
}
login = new(new Uri[] { new(uri, host, port) }, username, password);
login = new([new(uri, host, port)], username, password);
result = new(revisionDate,
creationTime,
folderId,

View File

@ -1,16 +1,30 @@
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240106
{
private record Record(string Key, Dictionary<string, string> KeyValuePairs);
private record Host([property: JsonPropertyName("a")] string? Id,
[property: JsonPropertyName("b")] string? Colon,
[property: JsonPropertyName("c")] string? Hyphen,
[property: JsonPropertyName("d")] string? Line,
[property: JsonPropertyName("e")] string? Count,
[property: JsonPropertyName("f")] string? Segments,
[property: JsonPropertyName("g")] string? Type,
[property: JsonPropertyName("h")] string? Device,
[property: JsonPropertyName("i")] string? Name,
[property: JsonPropertyName("j")] string? Location);
[JsonSourceGenerationOptions(WriteIndented = true, AllowTrailingCommas = true)]
[JsonSerializable(typeof(Host[]))]
private partial class HostCollectionSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Dictionary<string, Dictionary<string, string>>))]
@ -18,6 +32,8 @@ internal static partial class Helper20240106
{
}
private record Record(string Key, Dictionary<string, string> KeyValuePairs);
private static Dictionary<string, Dictionary<string, string>> GetKeyValuePairs(List<Record> collection, bool replaceFound)
{
Dictionary<string, Dictionary<string, string>> results = [];
@ -34,6 +50,27 @@ internal static partial class Helper20240106
return results;
}
private static Dictionary<int, Host> GetHosts(string jsonl)
{
Dictionary<int, Host> results = [];
int id;
string json = $"[{File.ReadAllText(jsonl).Replace("\r\n", ",")}]";
Host[] hosts = JsonSerializer.Deserialize(json, HostCollectionSourceGenerationContext.Default.HostArray) ?? throw new NullReferenceException(nameof(json));
foreach (Host host in hosts)
{
if (host.Id is null)
continue;
if (host.Hyphen is not null and nameof(host.Hyphen))
continue;
if (!int.TryParse(host.Id, out id))
throw new NotSupportedException($"{host.Id} is not a number");
if (results.ContainsKey(id))
throw new NotSupportedException($"Id {id} is not unique!");
results.Add(id, host);
}
return results;
}
private static int? GetHeaderLine(string[] lines)
{
int? headerLine = null;
@ -97,27 +134,6 @@ internal static partial class Helper20240106
return results;
}
private static Dictionary<int, Host> GetHosts(string jsonl)
{
Dictionary<int, Host> results = [];
int id;
string json = $"[{File.ReadAllText(jsonl).Replace("\r\n", ",")}]";
Host[] hosts = JsonSerializer.Deserialize(json, HostSourceGenerationContext.Default.HostArray) ?? throw new NullReferenceException(nameof(json));
foreach (Host host in hosts)
{
if (host.Id is null)
continue;
if (host.Hyphen is not null and nameof(host.Hyphen))
continue;
if (!int.TryParse(host.Id, out id))
throw new NotSupportedException($"{host.Id} is not a number");
if (results.ContainsKey(id))
throw new NotSupportedException($"Id {id} is not unique!");
results.Add(id, host);
}
return results;
}
private static ReadOnlyCollection<string> GetIpAddressAndVerify(ILogger<Worker> logger, string key, Dictionary<string, Dictionary<string, string>> keyValuePairs, Dictionary<int, Host> hosts, string filter)
{
List<string> results = [];
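
As a side note on the GetHosts method shown above: it loads a JSON-lines file by wrapping the text in brackets and replacing the CRLF separators with commas, then deserializes the result as a single array (the source-generated context sets AllowTrailingCommas, which covers a trailing newline). A stand-alone sketch of the same trick, using a hypothetical record shape and the plain reflection-based serializer rather than the repository's source-generated context:

```csharp
using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical record; Helper20240106 maps short single-letter JSON names such as "a" to Id.
internal sealed record Host([property: JsonPropertyName("a")] string? Id,
                            [property: JsonPropertyName("i")] string? Name);

internal static class JsonlExample
{
    // Same idea as GetHosts above: turn a .jsonl file into one JSON array and deserialize it.
    internal static Host[] Load(string jsonlPath)
    {
        string json = $"[{File.ReadAllText(jsonlPath).Replace("\r\n", ",")}]";
        // AllowTrailingCommas mirrors the repository's source-generation options and
        // tolerates the trailing comma produced by a final newline in the file.
        JsonSerializerOptions options = new() { AllowTrailingCommas = true };
        return JsonSerializer.Deserialize<Host[]>(json, options) ?? Array.Empty<Host>();
    }
}
```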

View File

@ -1,7 +1,7 @@
using DiscUtils.Iso9660;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240107
{

View File

@ -2,16 +2,17 @@ using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.RegularExpressions;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240108
{
private record Method(string Name,
private record Method(int EndLine,
string FirstLine,
int FirstUsedLine,
string Name,
int ParameterCount,
int StartLine,
int EndLine,
int FirstUsedLine);
int StartLine);
[GeneratedRegex(@"(?<method>[A-Z]{1}[A-Za-z_0-9]*)\(")]
private static partial Regex CSharpMethodName();
@ -40,7 +41,7 @@ internal static partial class Helper20240108
for (int j = i - 1; j > -1; j--)
{
line = lines[j].Trim();
if (!line.StartsWith('[') && !line.StartsWith("/// "))
if (!line.StartsWith('[') && !line.StartsWith('#') && !line.StartsWith("/// "))
break;
result--;
}
@ -155,7 +156,9 @@ internal static partial class Helper20240108
string line;
string? name;
int startLine;
Method method;
string search;
string firstLine;
string innerLine;
string searchNot;
string searchWrap;
@ -186,12 +189,18 @@ internal static partial class Helper20240108
startLine = GetStartLine(lines, i);
searchConstructor = $"{name.ToLower()} = new(";
parameterCount = GetParameterCount(line, search);
isLinq = lines[i + 1].Trim() != "{";
if (!lines[startLine].StartsWith("#pragma"))
firstLine = lines[startLine].Trim();
else
firstLine = lines[startLine + 1].Trim();
isLinq = !lines[i + 1].StartsWith("#pragma") && lines[i + 1].Trim() != "{";
if (isLinq)
blocks++;
for (int j = i + 1; j < lines.Length; j++)
{
innerLine = lines[j].Trim();
if (innerLine.StartsWith("#pragma"))
continue;
if (isLinq && string.IsNullOrEmpty(innerLine))
{
if (line.EndsWith(';'))
@ -218,7 +227,8 @@ internal static partial class Helper20240108
}
if (j > lines.Length - 2)
throw new Exception();
results.Add(new(name, parameterCount, startLine, endLine, firstUsedLine.Value));
method = new(endLine, firstLine, firstUsedLine.Value, name, parameterCount, startLine);
results.Add(method);
break;
}
}
@ -257,12 +267,18 @@ internal static partial class Helper20240108
return result;
}
private static bool SortFile(ILogger<Worker> logger, string cSharpFile, string[] lines)
private static bool SortFile(ILogger<Worker> logger, bool logOnly, string cSharpFile, string[] lines)
{
bool result;
ReadOnlyCollection<Method> methods = GetMethods(cSharpFile, logger, lines);
if (methods.Count == 0)
result = false;
else if (logOnly)
{
foreach (Method method in methods.OrderBy(l => l.Name))
logger.LogInformation("{cSharpFile} - {Name} has {lines} line(s)", cSharpFile, method.Name, (method.EndLine - method.StartLine).ToString("000000"));
result = false;
}
else
result = WriteAllLines(cSharpFile, lines, methods);
return result;
@ -275,11 +291,12 @@ internal static partial class Helper20240108
string[] lines;
bool usePathCombine = true;
long ticks = DateTime.Now.Ticks;
bool logOnly = bool.Parse(args[3]);
logger.LogInformation("{ticks}", ticks);
string directory = Path.GetFullPath(args[2]);
string repositoryDirectory = Path.GetFullPath(args[0]);
string[] cSharpFiles = Directory.GetFiles(directory, "*.cs", SearchOption.AllDirectories);
ReadOnlyCollection<string> gitOthersModifiedAndDeletedExcludingStandardFiles = Helpers.HelperGit.GetOthersModifiedAndDeletedExcludingStandardFiles(repositoryDirectory, usePathCombine, cancellationToken);
ReadOnlyCollection<string> gitOthersModifiedAndDeletedExcludingStandardFiles = logOnly ? new(cSharpFiles) : Helpers.HelperGit.GetOthersModifiedAndDeletedExcludingStandardFiles(repositoryDirectory, usePathCombine, cancellationToken);
for (int i = 0; i < 10; i++)
{
foreach (string cSharpFile in cSharpFiles)
@ -287,11 +304,11 @@ internal static partial class Helper20240108
if (!gitOthersModifiedAndDeletedExcludingStandardFiles.Contains(cSharpFile))
continue;
lines = File.ReadAllLines(cSharpFile);
check = SortFile(logger, cSharpFile, lines);
check = SortFile(logger, logOnly, cSharpFile, lines);
if (check && !result)
result = true;
}
if (!result)
if (logOnly || !result)
break;
}
}

View File

@ -2,7 +2,7 @@ using Microsoft.Extensions.Logging;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240129
{

View File

@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Globalization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240305
{

View File

@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240403
{
@ -42,7 +42,7 @@ internal static partial class Helper20240403
string keyIndex = args[5];
string directory = args[0];
logger.LogInformation(directory);
string[] columns = args[4].Split('|');
string[] columns = args[4].Split('~');
DynamicHostConfigurationProtocolConfiguration dynamicHostConfigurationProtocolConfiguration = new(columns, directory, ignore, int.Parse(keyIndex), pattern, primary);
AlertIfNewDeviceIsConnected(dynamicHostConfigurationProtocolConfiguration, logger);
}

View File

@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Text.RegularExpressions;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240404
{

View File

@ -2,12 +2,12 @@ using Microsoft.Extensions.Logging;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240409
{
internal record FsSize( // cSpell:disable
private record FsSize( // cSpell:disable
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("object")] string Object,
[property: JsonPropertyName("pmon")] string PMon,
@ -19,7 +19,7 @@ internal static partial class Helper20240409
[JsonSourceGenerationOptions(WriteIndented = true, AllowTrailingCommas = true)]
[JsonSerializable(typeof(FsSize))]
internal partial class FsSizeSourceGenerationContext : JsonSerializerContext
private partial class FsSizeSourceGenerationContext : JsonSerializerContext
{
}

View File

@ -2,7 +2,7 @@ using File_Folder_Helper.Helpers;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240417
{

View File

@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Text;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240426
{

View File

@ -1,14 +1,15 @@
using File_Folder_Helper.Helpers;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI1;
internal static partial class Helper20240427
{
internal record Asset( // cSpell:disable
private record Asset( // cSpell:disable
[property: JsonPropertyName("id")] string? Id,
[property: JsonPropertyName("deviceAssetId")] string? DeviceAssetId,
[property: JsonPropertyName("ownerId")] string? OwnerId,
@ -41,7 +42,7 @@ internal static partial class Helper20240427
[JsonSourceGenerationOptions(WriteIndented = true, AllowTrailingCommas = true)]
[JsonSerializable(typeof(List<Asset>))]
internal partial class AssetCollectionSourceGenerationContext : JsonSerializerContext
private partial class AssetCollectionSourceGenerationContext : JsonSerializerContext
{
}
@ -54,7 +55,7 @@ internal static partial class Helper20240427
string checkDirectory = home;
string sourceDirectory = home;
string originalFileNameWithoutExtension = Path.GetFileNameWithoutExtension(originalFileName);
List<string> directoryNames = HelperDirectory.GetDirectoryNames(path);
ReadOnlyCollection<string> directoryNames = HelperDirectory.GetDirectoryNames(path);
for (int i = 0; i < directoryNames.Count; i++)
{
if (directoryNames[i] != lastVarDirectoryName)
@ -99,8 +100,8 @@ internal static partial class Helper20240427
private static void MoveAssets(ILogger<Worker> logger, string var, string home, string pictures, List<Asset> assets)
{
string? checkFile;
List<string> varDirectoryNames = HelperDirectory.GetDirectoryNames(home);
string lastVarDirectoryName = varDirectoryNames[^1];
ReadOnlyCollection<string> directoryNames = HelperDirectory.GetDirectoryNames(home);
string lastDirectoryName = directoryNames[^1];
foreach (Asset asset in assets)
{
if (asset.OriginalFileName is null)
@ -113,10 +114,10 @@ internal static partial class Helper20240427
continue;
if (asset.OriginalPath is null || !asset.OriginalPath.StartsWith(pictures))
continue;
checkFile = MoveAsset(home, asset.OriginalFileName, lastVarDirectoryName, asset.PreviewPath);
checkFile = MoveAsset(home, asset.OriginalFileName, lastDirectoryName, asset.PreviewPath);
if (checkFile is null)
continue;
checkFile = MoveAsset(home, asset.OriginalFileName, lastVarDirectoryName, asset.ThumbnailPath);
checkFile = MoveAsset(home, asset.OriginalFileName, lastDirectoryName, asset.ThumbnailPath);
if (checkFile is null)
continue;
logger.LogDebug("<{OriginalFileName}> moved.", asset.OriginalFileName);

View File

@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Diagnostics;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240429
{

View File

@ -1,6 +1,6 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240510
{

View File

@ -2,7 +2,7 @@ using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Text.Json;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240513
{

View File

@ -3,31 +3,31 @@ using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240517
{
public record ContentSignature([property: JsonPropertyName("contentSignature")] string Value,
private record ContentSignature([property: JsonPropertyName("contentSignature")] string Value,
[property: JsonPropertyName("contentSignatureType")] string ContentSignatureType);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ContentSignature))]
public partial class ContentSignatureGenerationContext : JsonSerializerContext
private partial class ContentSignatureGenerationContext : JsonSerializerContext
{
}
public record Type([property: JsonPropertyName("count")] int Count,
private record Type([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Type))]
public partial class TypeGenerationContext : JsonSerializerContext
private partial class TypeGenerationContext : JsonSerializerContext
{
}
public record ImageAmazon([property: JsonPropertyName("colorSpace")] string ColorSpace,
private record ImageAmazon([property: JsonPropertyName("colorSpace")] string ColorSpace,
[property: JsonPropertyName("dateTime")] DateTime DateTime,
[property: JsonPropertyName("dateTimeDigitized")] DateTime DateTimeDigitized,
[property: JsonPropertyName("dateTimeOriginal")] DateTime DateTimeOriginal,
@ -54,11 +54,11 @@ internal static partial class Helper20240517
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ImageAmazon))]
public partial class ImageAmazonGenerationContext : JsonSerializerContext
private partial class ImageAmazonGenerationContext : JsonSerializerContext
{
}
public record ContentProperties([property: JsonPropertyName("contentDate")] DateTime ContentDate,
private record ContentProperties([property: JsonPropertyName("contentDate")] DateTime ContentDate,
[property: JsonPropertyName("contentSignatures")] IReadOnlyList<ContentSignature> ContentSignatures,
[property: JsonPropertyName("contentType")] string ContentType,
[property: JsonPropertyName("extension")] string Extension,
@ -69,19 +69,19 @@ internal static partial class Helper20240517
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ContentProperties))]
public partial class ContentPropertiesGenerationContext : JsonSerializerContext
private partial class ContentPropertiesGenerationContext : JsonSerializerContext
{
}
public record XAccntParentMap();
private record XAccntParentMap();
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(XAccntParentMap))]
public partial class XAccntParentMapGenerationContext : JsonSerializerContext
private partial class XAccntParentMapGenerationContext : JsonSerializerContext
{
}
public record Datum([property: JsonPropertyName("accessRuleIds")] IReadOnlyList<object> AccessRuleIds,
private record Datum([property: JsonPropertyName("accessRuleIds")] IReadOnlyList<object> AccessRuleIds,
[property: JsonPropertyName("childAssetTypeInfo")] IReadOnlyList<object> ChildAssetTypeInfo,
[property: JsonPropertyName("contentProperties")] ContentProperties ContentProperties,
[property: JsonPropertyName("createdBy")] string CreatedBy,
@ -111,27 +111,27 @@ internal static partial class Helper20240517
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Datum))]
public partial class DatumGenerationContext : JsonSerializerContext
private partial class DatumGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Dictionary<string, Datum>))]
public partial class DictionaryDatumGenerationContext : JsonSerializerContext
private partial class DictionaryDatumGenerationContext : JsonSerializerContext
{
}
public record LocationAmazon([property: JsonPropertyName("count")] int Count,
private record LocationAmazon([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(LocationAmazon))]
public partial class LocationAmazonGenerationContext : JsonSerializerContext
private partial class LocationAmazonGenerationContext : JsonSerializerContext
{
}
public record LocationInfo([property: JsonPropertyName("city")] string City,
private record LocationInfo([property: JsonPropertyName("city")] string City,
[property: JsonPropertyName("country")] string Country,
[property: JsonPropertyName("countryIso3Code")] string CountryIso3Code,
[property: JsonPropertyName("state")] string State,
@ -139,74 +139,74 @@ internal static partial class Helper20240517
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(LocationInfo))]
public partial class LocationInfoGenerationContext : JsonSerializerContext
private partial class LocationInfoGenerationContext : JsonSerializerContext
{
}
public record SearchData([property: JsonPropertyName("clusterName")] string ClusterName,
private record SearchData([property: JsonPropertyName("clusterName")] string ClusterName,
[property: JsonPropertyName("locationId")] string LocationId,
[property: JsonPropertyName("locationInfo")] LocationInfo LocationInfo,
[property: JsonPropertyName("thingId")] string ThingId);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(SearchData))]
public partial class SearchDataGenerationContext : JsonSerializerContext
private partial class SearchDataGenerationContext : JsonSerializerContext
{
}
public record AllPerson([property: JsonPropertyName("count")] int Count,
private record AllPerson([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(AllPerson))]
public partial class AllPersonGenerationContext : JsonSerializerContext
private partial class AllPersonGenerationContext : JsonSerializerContext
{
}
public record PersonAmazon([property: JsonPropertyName("count")] int Count,
private record PersonAmazon([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(PersonAmazon))]
public partial class PersonAmazonGenerationContext : JsonSerializerContext
private partial class PersonAmazonGenerationContext : JsonSerializerContext
{
}
public record ClusterId([property: JsonPropertyName("count")] int Count,
private record ClusterId([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ClusterId))]
public partial class ClusterIdGenerationContext : JsonSerializerContext
private partial class ClusterIdGenerationContext : JsonSerializerContext
{
}
public record Thing([property: JsonPropertyName("count")] int Count,
private record Thing([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Thing))]
public partial class ThingGenerationContext : JsonSerializerContext
private partial class ThingGenerationContext : JsonSerializerContext
{
}
public record Time([property: JsonPropertyName("count")] int Count,
private record Time([property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("match")] string Match,
[property: JsonPropertyName("searchData")] SearchData SearchData);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Time))]
public partial class TimeGenerationContext : JsonSerializerContext
private partial class TimeGenerationContext : JsonSerializerContext
{
}
public record ParentMap([property: JsonPropertyName("FOLDER")] IReadOnlyList<string> FOLDER);
private record ParentMap([property: JsonPropertyName("FOLDER")] IReadOnlyList<string> FOLDER);
public record Aggregations([property: JsonPropertyName("allPeople")] IReadOnlyList<AllPerson> AllPeople,
private record Aggregations([property: JsonPropertyName("allPeople")] IReadOnlyList<AllPerson> AllPeople,
[property: JsonPropertyName("clusterId")] IReadOnlyList<ClusterId> ClusterId,
[property: JsonPropertyName("location")] IReadOnlyList<LocationAmazon> Location,
[property: JsonPropertyName("people")] IReadOnlyList<PersonAmazon> People,
@@ -216,23 +216,23 @@ internal static partial class Helper20240517
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Aggregations))]
public partial class AggregationsGenerationContext : JsonSerializerContext
private partial class AggregationsGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ParentMap))]
public partial class ParentMapGenerationContext : JsonSerializerContext
private partial class ParentMapGenerationContext : JsonSerializerContext
{
}
public record RootAmazon([property: JsonPropertyName("aggregations")] Aggregations Aggregations,
private record RootAmazon([property: JsonPropertyName("aggregations")] Aggregations Aggregations,
[property: JsonPropertyName("count")] int Count,
[property: JsonPropertyName("data")] IReadOnlyList<Datum> Data);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(RootAmazon))]
public partial class RootAmazonGenerationContext : JsonSerializerContext
private partial class RootAmazonGenerationContext : JsonSerializerContext
{
}

View File

@@ -2,7 +2,7 @@ using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Text.Json;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240518
{

View File

@@ -2,7 +2,7 @@ using File_Folder_Helper.Helpers;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240519
{

View File

@@ -4,7 +4,7 @@ using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240520
{

View File

@@ -0,0 +1,734 @@
using File_Folder_Helper.Helpers;
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Diagnostics;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.RegularExpressions;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240623
{
[GeneratedRegex("([A-Z]+(.))")]
private static partial Regex UpperCase();
[GeneratedRegex("[\\s!?.,@:;|\\\\/\"'`£$%\\^&*{}[\\]()<>~#+\\-=_¬]+")]
private static partial Regex InvalidCharacter();
private record H1AndParamCase(string H1, string ParamCase);
private record SubTaskLine(string Text, bool Started, bool Completed, long? Ticks, int? Line);
private record Record(int? CodeInsidersLine, FileInfo FileInfo, LineNumber LineNumber, int? StopLine, int? SubTasksLine);
private record Input(long? AfterEpochTotalMilliseconds,
string CodeInsiders,
ReadOnlyCollection<string> DestinationDirectories,
string DirectoryFilter,
string Done,
string IndexFile,
string SearchPattern,
string SubTasks,
string SourceDirectory,
ReadOnlyCollection<string> Tasks);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Input))]
private partial class InputSourceGenerationContext : JsonSerializerContext
{
}
private static Record GetRecord(Input input, FileInfo fileInfo)
{
Record result;
string line;
int? stopLine = null;
int? subTasksLine = null;
int? codeInsidersLine = null;
LineNumber lineNumber = HelperMarkdown.GetLineNumbers(fileInfo);
for (int i = 0; i < lineNumber.Lines.Count; i++)
{
line = lineNumber.Lines[i];
if (line.StartsWith(input.CodeInsiders) && line[^1] == ')')
codeInsidersLine = i;
if (line != input.SubTasks)
continue;
subTasksLine = i;
if (codeInsidersLine is null)
break;
if (lineNumber.Lines.Count > i)
{
for (int j = i + 1; j < lineNumber.Lines.Count; j++)
{
if (lineNumber.Lines[j].Length > 0 && lineNumber.Lines[j][0] == '#')
{
stopLine = j;
break;
}
}
}
stopLine ??= lineNumber.Lines.Count;
break;
}
result = new(codeInsidersLine, fileInfo, lineNumber, stopLine, subTasksLine);
return result;
}
private static List<Record> GetRecords(Input input)
{
List<Record> results = [];
Record record;
FileInfo fileInfo;
string sourceDirectory = input.SourceDirectory;
ReadOnlyCollection<string> directoryNames = HelperDirectory.GetDirectoryNames(input.SourceDirectory);
if (!directoryNames.Any(l => l.StartsWith(input.DirectoryFilter, StringComparison.CurrentCultureIgnoreCase)))
{
string directoryName;
string[] checkDirectories = Directory.GetDirectories(input.SourceDirectory, "*", SearchOption.TopDirectoryOnly);
foreach (string checkDirectory in checkDirectories)
{
directoryName = Path.GetFileName(checkDirectory);
if (directoryName.StartsWith(input.DirectoryFilter, StringComparison.CurrentCultureIgnoreCase))
{
sourceDirectory = checkDirectory;
break;
}
}
}
string[] subDirectories = Directory.GetDirectories(sourceDirectory, "*", SearchOption.TopDirectoryOnly);
List<string> files = Directory.GetFiles(sourceDirectory, input.SearchPattern, SearchOption.TopDirectoryOnly).ToList();
foreach (string subDirectory in subDirectories)
files.AddRange(Directory.GetFiles(subDirectory, input.SearchPattern, SearchOption.TopDirectoryOnly));
foreach (string file in files)
{
fileInfo = new(file);
record = GetRecord(input, fileInfo);
results.Add(record);
}
return results;
}
private static string GetParamCase(string value)
{
string result;
StringBuilder stringBuilder = new(value);
Match[] matches = UpperCase().Matches(value).ToArray();
for (int i = matches.Length - 1; i > -1; i--)
_ = stringBuilder.Insert(matches[i].Index, '-');
string[] segments = InvalidCharacter().Split(stringBuilder.ToString().ToLower());
result = string.Join('-', segments).Trim('-');
return result;
}
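// Illustrative example (not part of the commit): GetParamCase("UpdateSubTasksInMarkdownFiles")
// inserts a hyphen before each capital-letter run, lowercases the result, splits on the
// InvalidCharacter class, and rejoins, yielding "update-sub-tasks-in-markdown-files".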
private static ReadOnlyCollection<SubTaskLine> GetSubTaskLines(Input input, bool? foundStarted, bool? foundCompleted, string fallbackLine, Record record)
{
List<SubTaskLine> results = [];
char done;
string line;
string text;
bool startedValue;
bool completedValue;
SubTaskLine subTaskLine;
bool foundSubTasks = false;
int tasksZeroLength = input.Tasks[0].Length;
long ticks = record.FileInfo.LastWriteTime.Ticks;
for (int i = 0; i < record.LineNumber.Lines.Count; i++)
{
line = record.LineNumber.Lines[i];
if (!foundSubTasks && line == input.SubTasks)
foundSubTasks = true;
if (!foundSubTasks)
continue;
if (line.Length <= tasksZeroLength || !line.StartsWith(input.Tasks[0]) || line[tasksZeroLength] is not ' ' and not 'x' || line[tasksZeroLength + 1] != ']')
continue;
startedValue = foundStarted is not null && foundStarted.Value;
completedValue = foundCompleted is not null && foundCompleted.Value;
subTaskLine = new(Text: $" {line}", Started: startedValue, Completed: completedValue, Ticks: ticks, Line: i);
results.Add(subTaskLine);
}
startedValue = foundStarted is not null && foundStarted.Value;
completedValue = foundCompleted is not null && foundCompleted.Value;
if (record.LineNumber.H1 is null)
subTaskLine = new(Text: fallbackLine, Started: startedValue, Completed: completedValue, Ticks: ticks, Line: null);
else
{
done = foundCompleted is null || !foundCompleted.Value ? ' ' : 'x';
string codeInsidersLine = record.CodeInsidersLine is null ? string.Empty : $" ~~{record.LineNumber.Lines[record.CodeInsidersLine.Value]}~~";
text = $"- [{done}] {ticks} {record.LineNumber.Lines[record.LineNumber.H1.Value]}{codeInsidersLine}";
subTaskLine = new(Text: text, Started: startedValue, Completed: completedValue, Ticks: ticks, Line: 0);
}
results.Add(subTaskLine);
return new(results);
}
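// Descriptive note: for every "- [ ]" / "- [x]" line found after the Sub-tasks heading, this
// overload emits the line indented and stamped with the file's LastWriteTime ticks, then
// appends a summary entry of the form "- [{done}] {ticks} {H1 heading}" (with the
// code-insiders line struck through via ~~...~~ when present), or the caller-supplied
// fallback line when the markdown file has no H1.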
private static string GetSeasonName(int dayOfYear)
{
string result = dayOfYear switch
{
< 78 => "0.Winter",
< 124 => "1.Spring",
< 171 => "2.Spring",
< 217 => "3.Summer",
< 264 => "4.Summer",
< 309 => "5.Fall",
< 354 => "6.Fall",
_ => "7.Winter"
};
return result;
}
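// Illustrative mapping: day-of-year 100 => "1.Spring", 200 => "3.Summer", 360 => "7.Winter";
// the numeric prefixes make the season folder names sort in calendar order.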
private static string[] GetIndexLines(string h1, ReadOnlyCollection<H1AndParamCase> h1ParamCaseCollection) =>
[
"---",
"startedColumns:",
" - 'In Progress'",
"completedColumns:",
" - Done",
"---",
string.Empty,
h1[0] == '#' ? h1 : $"# {h1}",
string.Empty,
"## Backlog",
string.Empty,
string.Join(Environment.NewLine, h1ParamCaseCollection.Select(l => $"- [{l.ParamCase}](tasks/{l.ParamCase}.md)")),
string.Empty,
"## Todo",
string.Empty,
"## In Progress",
string.Empty,
"## Done",
string.Empty
];
private static string[] GetCascadingStyleSheetsLines() =>
[
".kanbn-column-done .kanbn-column-task-list {",
" border-color: #198038;",
"}",
string.Empty,
".kanbn-task-data-created {",
" display: none;",
"}",
string.Empty,
".kanbn-task-data-workload {",
" display: none;",
"}"
];
private static string GetSettingsLines() =>
/*lang=json,strict*/ """
{
"[markdown]": {
"editor.wordWrap": "off"
},
"cSpell.words": [
"kanbn"
]
}
""";
private static string GetTasksLines(string directory) =>
/*lang=json,strict*/ """
{
"version": "2.0.0",
"tasks": [
{
"label": "File-Folder-Helper AOT s X Day-Helper-2024-06-23",
"type": "shell",
"command": "L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe",
"args": [
"s",
"X",
"{}",
"Day-Helper-2024-06-23",
"*.md",
"##_Sub-tasks",
"-_[code-insiders](",
"index.md",
"-_[,](",
"##_Done",
".kan",
"D:/5-Other-Small/Kanban/Year-Season",
"316940400000"
],
"problemMatcher": []
}
]
}
""".Replace("{}", directory.Replace('\\', '/'));
private static void FileWriteAllText(string path, string contents)
{
// string checkJson = Regex.Replace(File.ReadAllText(path), @"\s+", " ", RegexOptions.Multiline);
// if (Regex.Replace(singletonJson, @"\s+", " ", RegexOptions.Multiline) != checkJson)
// File.WriteAllText(path, singletonJson);
string old = !File.Exists(path) ? string.Empty : File.ReadAllText(path);
if (old != contents)
File.WriteAllText(path, contents);
}
private static void FileWriteAllText(string path, string[] contents) =>
FileWriteAllText(path, string.Join(Environment.NewLine, contents));
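// Descriptive note: both FileWriteAllText overloads only touch the file when the new contents
// differ from what is already on disk, so an unchanged markdown file keeps its LastWriteTime
// (whose ticks are embedded in the generated sub-task lines).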
private static ReadOnlyCollection<H1AndParamCase> GetH1ParamCaseCollection(Input input, ReadOnlyCollection<string> lines)
{
List<H1AndParamCase> results = [];
string h1;
string line;
string paramCase;
bool foundSubTasks = false;
H1AndParamCase h1AndParamCase;
int tasksZeroLength = input.Tasks[0].Length;
for (int i = 0; i < lines.Count; i++)
{
line = lines[i];
if (!foundSubTasks && line == input.SubTasks)
foundSubTasks = true;
if (!foundSubTasks)
continue;
if (line.Length <= tasksZeroLength || !line.StartsWith(input.Tasks[0]) || line[tasksZeroLength] is not ' ' and not 'x' || line[tasksZeroLength + 1] != ']')
continue;
h1 = line[(tasksZeroLength + 3)..];
if (string.IsNullOrEmpty(h1))
continue;
paramCase = GetParamCase(h1);
h1AndParamCase = new(h1, paramCase);
results.Add(h1AndParamCase);
}
return results.AsReadOnly();
}
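// Illustrative example: under the Sub-tasks heading, a line such as "- [x] Fix the build"
// yields H1 = "Fix the build" and ParamCase = "fix-the-build", which later becomes the
// kanbn task file tasks/fix-the-build.md.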
private static void CreateFiles(string directory, ReadOnlyCollection<H1AndParamCase> h1ParamCaseCollection)
{
foreach (H1AndParamCase h1ParamCase in h1ParamCaseCollection)
FileWriteAllText(Path.Combine(directory, $"{h1ParamCase.ParamCase}.md"), $"# {h1ParamCase.H1}");
}
private static string WriteAndGetIndexFile(string h1, string verifiedDirectory, ReadOnlyCollection<H1AndParamCase> h1ParamCaseCollection)
{
string result;
string[] indexLines = GetIndexLines(h1, h1ParamCaseCollection);
string kanbanDirectory = Path.Combine(verifiedDirectory, ".kanbn");
string tasksKanbanDirectory = Path.Combine(kanbanDirectory, "tasks");
if (!Directory.Exists(tasksKanbanDirectory))
_ = Directory.CreateDirectory(tasksKanbanDirectory);
string verifiedVisualStudioCodeDirectory = Path.Combine(verifiedDirectory, ".vscode");
if (!Directory.Exists(verifiedVisualStudioCodeDirectory))
_ = Directory.CreateDirectory(verifiedVisualStudioCodeDirectory);
result = Path.Combine(kanbanDirectory, "index.md");
CreateFiles(tasksKanbanDirectory, h1ParamCaseCollection);
FileWriteAllText(result, indexLines);
FileWriteAllText(Path.Combine(kanbanDirectory, "board.css"), GetCascadingStyleSheetsLines());
FileWriteAllText(Path.Combine(verifiedVisualStudioCodeDirectory, "settings.json"), GetSettingsLines());
FileWriteAllText(Path.Combine(verifiedVisualStudioCodeDirectory, "tasks.json"), GetTasksLines(verifiedDirectory));
return result;
}
private static ReadOnlyCollection<string> GetXColumns(Input input, int frontMatterYamlEnd, int value, ReadOnlyCollection<string> lines)
{
List<string> results = [];
string[] segments;
for (int i = value + 1; i < frontMatterYamlEnd; i++)
{
segments = lines[i].Replace("\t", " ").Split(" - ");
if (segments.Length != 2)
break;
results.Add($"## {segments[1].Replace("'", string.Empty)}");
}
if (results.Count == 0)
results.Add(input.Done);
return results.AsReadOnly();
}
private static ReadOnlyCollection<string> GetCompletedColumns(Input input, LineNumber lineNumber)
{
List<string> results;
if (lineNumber.FrontMatterYamlEnd is null || lineNumber.CompletedColumns is null)
results = [];
else
results = GetXColumns(input, lineNumber.FrontMatterYamlEnd.Value, lineNumber.CompletedColumns.Value, lineNumber.Lines).ToList();
if (results.Count == 0)
results.Add(input.Done);
return results.AsReadOnly();
}
private static ReadOnlyCollection<string> GetStartedColumns(Input input, LineNumber lineNumber)
{
List<string> results;
if (lineNumber.FrontMatterYamlEnd is null || lineNumber.StartedColumns is null)
results = [];
else
results = GetXColumns(input, lineNumber.FrontMatterYamlEnd.Value, lineNumber.StartedColumns.Value, lineNumber.Lines).ToList();
if (results.Count == 0)
results.Add(input.Done);
return results.AsReadOnly();
}
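// Illustrative example: front matter of the shape written by GetIndexLines, i.e.
//   startedColumns:
//     - 'In Progress'
// is parsed by GetXColumns into "## In Progress" (quotes stripped, "## " prefixed), so the
// heading positions in the index file decide which task links count as started or completed;
// when no columns are declared, input.Done ("## Done" with the tasks.json above) is the only match.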
private static ReadOnlyCollection<SubTaskLine> GetSubTaskLines(Input input, FileInfo fileInfo, LineNumber lineNumber)
{
List<SubTaskLine> results = [];
FileInfo f;
Record record;
char completed;
string[] segments;
bool startedValue;
bool completedValue;
string fallbackLine;
bool? foundStarted = null;
bool? foundCompleted = null;
ReadOnlyCollection<SubTaskLine> subTaskLines;
ReadOnlyCollection<string> startedColumns = GetStartedColumns(input, lineNumber);
ReadOnlyCollection<string> completedColumns = GetCompletedColumns(input, lineNumber);
int start = lineNumber.FrontMatterYamlEnd is null ? 0 : lineNumber.FrontMatterYamlEnd.Value + 1;
for (int i = start; i < lineNumber.Lines.Count; i++)
{
if ((foundStarted is null || !foundStarted.Value) && startedColumns.Any(lineNumber.Lines[i].StartsWith))
foundStarted = true;
if ((foundCompleted is null || !foundCompleted.Value) && completedColumns.Any(lineNumber.Lines[i].StartsWith))
foundCompleted = true;
segments = lineNumber.Lines[i].Split(input.Tasks[1]);
startedValue = foundStarted is not null && foundStarted.Value;
completedValue = foundCompleted is not null && foundCompleted.Value;
if (segments.Length > 2 || !segments[0].StartsWith(input.Tasks[0]))
continue;
completed = foundCompleted is null || !foundCompleted.Value ? ' ' : 'x';
fallbackLine = $"- [{completed}] {segments[0][input.Tasks[0].Length..]} ~~FallbackLine~~";
if (string.IsNullOrEmpty(fileInfo.DirectoryName))
continue;
f = new(Path.GetFullPath(Path.Combine(fileInfo.DirectoryName, segments[1][..^1])));
if (!f.Exists)
{
results.Add(new(Text: fallbackLine, Started: startedValue, Completed: completedValue, Ticks: null, Line: null));
continue;
}
record = GetRecord(input, f);
subTaskLines = GetSubTaskLines(input, startedValue, completedValue, fallbackLine, record);
for (int j = subTaskLines.Count - 1; j >= 0; j--)
results.Add(subTaskLines[j]);
}
return results.AsReadOnly();
}
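// Descriptive note: this overload walks the kanbn index file; once a started or completed
// column heading has been passed, subsequent "- [title](relative.md)" links (split on
// Tasks[1] = "](", prefixed by Tasks[0] = "- [") are resolved relative to the index file's
// directory and the linked file's own sub-task lines are pulled in via GetRecord, with a
// "~~FallbackLine~~" entry substituted when the linked file does not exist.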
private static Input GetInput(List<string> args)
{
string indexFile = args[5];
string searchPattern = args[2];
string directoryFilter = args[8];
string done = args[7].Replace('_', ' ');
string subTasks = args[3].Replace('_', ' ');
string codeInsiders = args[4].Replace('_', ' ');
string sourceDirectory = Path.GetFullPath(args[0]);
string[] tasks = args[6].Split(',').Select(l => l.Replace('_', ' ')).ToArray();
long? afterEpochTotalMilliseconds = args.Count < 11 ? null : long.Parse(args[10]);
string[] destinationDirectories = args.Count < 10 ? [] : args.Count < 12 ? [Path.GetFullPath(args[9])] : [Path.GetFullPath(args[9]), Path.GetFullPath(args[11])];
Input input = new(AfterEpochTotalMilliseconds: afterEpochTotalMilliseconds,
CodeInsiders: codeInsiders,
DestinationDirectories: destinationDirectories.AsReadOnly(),
DirectoryFilter: directoryFilter,
Done: done,
IndexFile: indexFile,
SearchPattern: searchPattern,
SubTasks: subTasks,
SourceDirectory: sourceDirectory,
Tasks: tasks.AsReadOnly());
if (input.Tasks[0] != "- [" || input.Tasks[1] != "](")
throw new Exception(JsonSerializer.Serialize(input, InputSourceGenerationContext.Default.Input));
return input;
}
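// Illustrative mapping, assuming the leading "s" / "X" verb and key from the tasks.json above
// are consumed before this list reaches GetInput: args[0] = source directory, args[2] = "*.md",
// args[3] = "## Sub-tasks", args[4] = "- [code-insiders](", args[5] = "index.md",
// args[6] = "- [,](" (split into Tasks "- [" and "]("), args[7] = "## Done", args[8] = ".kan",
// args[9] = destination directory, args[10] = minimum epoch milliseconds.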
private static string? MaybeWriteAndGetIndexFile(Input input, Record record, string? checkDirectory)
{
string? result;
if (string.IsNullOrEmpty(checkDirectory) || input.AfterEpochTotalMilliseconds is null || input.DestinationDirectories.Count == 0)
result = null;
else
{
if (!input.DestinationDirectories.Any(checkDirectory.Contains))
result = null;
else
{
if (record.LineNumber.H1 is null)
result = null;
else
{
string segment = Path.GetFileName(checkDirectory);
string h1 = record.LineNumber.Lines[record.LineNumber.H1.Value];
DateTime utcEpochDateTime = new(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
long utcEpochTotalMilliseconds = (long)Math.Floor(DateTime.UtcNow.Subtract(utcEpochDateTime).TotalMilliseconds);
if (!long.TryParse(segment, out long check) || check < input.AfterEpochTotalMilliseconds || check > utcEpochTotalMilliseconds)
result = null;
else
{
ReadOnlyCollection<H1AndParamCase> h1ParamCaseCollection = GetH1ParamCaseCollection(input, record.LineNumber.Lines);
if (h1ParamCaseCollection.Count == 0)
result = null;
else
{
DateTime dateTime = utcEpochDateTime.AddMilliseconds(check).ToLocalTime();
string seasonName = GetSeasonName(dateTime.DayOfYear);
ReadOnlyCollection<string> directoryNames = HelperDirectory.GetDirectoryNames(checkDirectory);
if (!directoryNames.Contains(dateTime.Year.ToString()) || !directoryNames.Contains($"{dateTime.Year}-{seasonName}") || !directoryNames.Contains(check.ToString()))
result = null;
else
result = WriteAndGetIndexFile(h1, checkDirectory, h1ParamCaseCollection);
}
}
}
}
}
return result;
}
private static bool FileWrite(long ticks, Record record, List<string> newLines, double percent)
{
bool result = false;
if (record.StopLine is not null && record.SubTasksLine is not null)
{
string contents;
string progressLine;
List<string> resultLines;
if (record.FileInfo.LastWriteTime.Ticks <= ticks)
resultLines = record.LineNumber.Lines.ToList();
else
resultLines = File.ReadAllLines(record.FileInfo.FullName).ToList();
if (record.LineNumber.FrontMatterYamlEnd is not null)
{
progressLine = $"progress: {percent}";
if (record.LineNumber.Progress is not null)
resultLines[record.LineNumber.Progress.Value] = progressLine;
else
{
resultLines.Insert(record.LineNumber.FrontMatterYamlEnd.Value, progressLine);
contents = string.Join(Environment.NewLine, resultLines);
FileWriteAllText(record.FileInfo.FullName, contents);
result = true;
}
if (!result && record.LineNumber.Completed is null && percent > 99.9)
{
resultLines.Insert(record.LineNumber.FrontMatterYamlEnd.Value, $"completed: {DateTime.Now:yyyy-MM-dd}");
contents = string.Join(Environment.NewLine, resultLines);
FileWriteAllText(record.FileInfo.FullName, contents);
result = true;
}
if (!result && record.LineNumber.Completed is not null && percent < 99.9)
{
resultLines.RemoveAt(record.LineNumber.Completed.Value);
contents = string.Join(Environment.NewLine, resultLines);
FileWriteAllText(record.FileInfo.FullName, contents);
result = true;
}
}
if (!result)
{
for (int i = record.StopLine.Value - 1; i > record.SubTasksLine.Value + 1; i--)
resultLines.RemoveAt(i);
if (record.StopLine.Value == record.LineNumber.Lines.Count && resultLines[^1].Length == 0)
resultLines.RemoveAt(resultLines.Count - 1);
for (int i = 0; i < newLines.Count; i++)
resultLines.Insert(record.SubTasksLine.Value + 1 + i, newLines[i]);
resultLines.Insert(record.SubTasksLine.Value + 1, string.Empty);
contents = string.Join(Environment.NewLine, resultLines);
FileWriteAllText(record.FileInfo.FullName, contents);
}
}
return result;
}
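// Descriptive note: the front-matter line is written as "progress: {percent}", where percent
// arrives as a 0-1 fraction (Math.Round(completed / all, 3) in UpdateSubTasksInMarkdownFiles),
// so the "> 99.9" branch that inserts a "completed:" date appears effectively unreachable with
// the current caller, and an existing "completed:" line is removed instead.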
private static string? GetInferredCheckDirectory(string directory)
{
string? result = null;
List<string> directoryNames = [];
DirectoryInfo directoryInfo;
string? checkDirectory = directory;
directoryNames.Add(Path.GetFileName(checkDirectory));
string pathRoot = Path.GetPathRoot(directory) ?? throw new Exception();
for (int i = 0; i < directory.Length; i++)
{
checkDirectory = Path.GetDirectoryName(checkDirectory);
if (string.IsNullOrEmpty(checkDirectory) || checkDirectory == pathRoot)
break;
directoryInfo = new(checkDirectory);
if (!directoryInfo.Exists)
directoryNames.Add(directoryInfo.Name);
else
{
directoryNames.Reverse();
result = string.IsNullOrEmpty(directoryInfo.LinkTarget) ? checkDirectory : directoryInfo.LinkTarget;
for (int j = 0; j < directoryNames.Count; j++)
result = Path.GetDirectoryName(result) ?? throw new Exception();
foreach (string directoryName in directoryNames)
result = Path.Combine(result, directoryName);
break;
}
}
return result;
}
private static void UpdateFileAndStartNewProcess(ILogger<Worker> logger, Input input, Record record, string inferredCheckDirectory)
{
if (record.CodeInsidersLine is null)
throw new Exception();
List<string> lines = record.LineNumber.Lines.ToList();
lines[record.CodeInsidersLine.Value] = $"{input.CodeInsiders}{inferredCheckDirectory})";
string text = string.Join(Environment.NewLine, lines);
File.WriteAllText(record.FileInfo.FullName, text);
record.FileInfo.Refresh();
string file = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "Programs", "Microsoft VS Code Insiders", "Code - Insiders.exe");
try
{ _ = Process.Start(file, $"\"{inferredCheckDirectory}\""); }
catch (Exception) { logger.LogWarning("Failed to start code-insiders!"); }
}
private static FileInfo GetIndexFileInfo(ILogger<Worker> logger, Input input, Record record)
{
FileInfo result;
string? indexFile;
List<string> results;
if (record.CodeInsidersLine is null)
throw new Exception();
string codeInsidersLine = record.LineNumber.Lines[record.CodeInsidersLine.Value];
string raw = codeInsidersLine[input.CodeInsiders.Length..^1];
string checkDirectory = $"{raw[..2].ToUpper()}{raw[2..]}";
if (!Directory.Exists(checkDirectory))
{
if (input.DestinationDirectories.Count > 0 && input.DestinationDirectories.Any(checkDirectory.Contains))
{
string? inferredCheckDirectory = GetInferredCheckDirectory(checkDirectory);
if (!string.IsNullOrEmpty(inferredCheckDirectory))
{
checkDirectory = inferredCheckDirectory;
_ = Directory.CreateDirectory(inferredCheckDirectory);
UpdateFileAndStartNewProcess(logger, input, record, inferredCheckDirectory);
}
}
}
if (!Directory.Exists(checkDirectory))
results = [];
else
{
results = Directory.GetFiles(checkDirectory, input.IndexFile, SearchOption.AllDirectories).ToList();
if (results.Count != 1)
{
for (int i = results.Count - 1; i > -1; i--)
{
if (!results[i].Contains(input.DirectoryFilter, StringComparison.CurrentCultureIgnoreCase))
results.RemoveAt(i);
}
}
if (results.Count == 0)
{
indexFile = MaybeWriteAndGetIndexFile(input, record, checkDirectory);
if (!string.IsNullOrEmpty(indexFile))
results.Add(indexFile);
else
logger.LogInformation("<{checkDirectory}>", checkDirectory);
}
}
result = results.Count == 0 ? new(Path.Combine(checkDirectory, input.IndexFile)) : new(results[0]);
return result;
}
internal static void UpdateSubTasksInMarkdownFiles(ILogger<Worker> logger, List<string> args)
{
bool reload;
int allCount;
int lineCheck;
double percent;
string replace;
FileInfo fileInfo;
double startedCount;
List<Record> records;
double completedCount;
LineNumber lineNumber;
List<string> newLines;
bool reloadAny = false;
string? checkDirectory;
List<string> oldLines = [];
Input input = GetInput(args);
string fileNameWithoutExtension;
long ticks = DateTime.Now.Ticks;
ReadOnlyCollection<SubTaskLine> subTaskLines;
for (int z = 0; z < 9; z++)
{
records = GetRecords(input);
foreach (Record record in from l in records orderby l.SubTasksLine is null, l.CodeInsidersLine is null select l)
{
if (record.SubTasksLine is null)
continue;
fileNameWithoutExtension = Path.GetFileNameWithoutExtension(record.FileInfo.FullName);
if (record.CodeInsidersLine is not null)
logger.LogInformation("<{file}> has [{subTasks}]", fileNameWithoutExtension, input.SubTasks);
else
{
logger.LogWarning("<{file}> has [{subTasks}] but doesn't have [{codeInsiders}]!", fileNameWithoutExtension, input.SubTasks, input.CodeInsiders);
continue;
}
if (record.StopLine is null)
continue;
fileInfo = GetIndexFileInfo(logger, input, record);
if (!fileInfo.Exists)
{
logger.LogError("<{checkDirectory}> doesn't have a [{indexFile}]", fileInfo.DirectoryName, input.IndexFile);
continue;
}
oldLines.Clear();
checkDirectory = fileInfo.DirectoryName;
lineNumber = HelperMarkdown.GetLineNumbers(fileInfo);
subTaskLines = GetSubTaskLines(input, fileInfo, lineNumber);
if (subTaskLines.Count == 0)
continue;
lineCheck = 0;
for (int i = record.SubTasksLine.Value + 1; i < record.StopLine.Value - 1; i++)
oldLines.Add(record.LineNumber.Lines[i]);
if (subTaskLines.Count == 0)
{
percent = 0;
replace = "0";
}
else
{
allCount = (from l in subTaskLines where l.Line is not null && l.Line.Value == 0 select 1).Count();
completedCount = (from l in subTaskLines where l.Line is not null && l.Line.Value == 0 && l.Completed select 1).Count();
startedCount = (from l in subTaskLines where l.Line is not null && l.Line.Value == 0 && l.Started && !l.Completed select 1).Count();
percent = allCount == 0 ? 0 : Math.Round(completedCount / allCount, 3);
// newLines.Insert(0, $"- [{done}] Sub-tasks {doneCount} of {allCount} [{percent * 100}%]");
replace = $"{allCount} » {startedCount} ✓ {completedCount} {Math.Floor(percent * 100)}%".Replace(" ✓ 0 0%", string.Empty).Replace(" 100%", string.Empty).Replace(" » 0", string.Empty);
}
if (subTaskLines.Any(l => l.Ticks is null))
newLines = (from l in subTaskLines
select l.Text).ToList();
else
{
newLines = (from l in subTaskLines
orderby l.Completed descending, l.Started descending, l.Ticks, l.Line
select l.Text.Replace($"{l.Ticks}", replace)).ToList();
}
if (newLines.Count == oldLines.Count)
{
for (int i = 0; i < newLines.Count; i++)
{
if (newLines[i] != record.LineNumber.Lines[record.SubTasksLine.Value + 1 + i])
continue;
lineCheck++;
}
if (lineCheck == newLines.Count)
continue;
}
if (string.IsNullOrEmpty(checkDirectory))
continue;
checkDirectory = Path.Combine(checkDirectory, DateTime.Now.Ticks.ToString());
_ = Directory.CreateDirectory(checkDirectory);
Thread.Sleep(500);
Directory.Delete(checkDirectory);
reload = FileWrite(ticks, record, newLines, percent);
if (!reloadAny && reload)
reloadAny = true;
}
if (!reloadAny)
break;
}
}
}

View File

@@ -1,6 +1,6 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240624
{

View File

@@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240711
{

View File

@@ -1,20 +1,37 @@
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240718
{
private record Host([property: JsonPropertyName("a")] string? Id,
[property: JsonPropertyName("b")] string? Colon,
[property: JsonPropertyName("c")] string? Hyphen,
[property: JsonPropertyName("d")] string? Line,
[property: JsonPropertyName("e")] string? Count,
[property: JsonPropertyName("f")] string? Segments,
[property: JsonPropertyName("g")] string? Type,
[property: JsonPropertyName("h")] string? Device,
[property: JsonPropertyName("i")] string? Name,
[property: JsonPropertyName("j")] string? Location);
[JsonSourceGenerationOptions(WriteIndented = true, AllowTrailingCommas = true)]
[JsonSerializable(typeof(Host[]))]
private partial class HostsSourceGenerationContext : JsonSerializerContext
{
}
private static Host[] GetHosts(ILogger<Worker> logger, string file)
{
Host[] results;
string lines = File.ReadAllText(file);
string json = $"[{lines.Replace("\r\n", ",")}]";
logger.LogDebug(lines);
results = JsonSerializer.Deserialize(json, HostSourceGenerationContext.Default.HostArray) ?? throw new NullReferenceException();
results = JsonSerializer.Deserialize(json, HostsSourceGenerationContext.Default.HostArray) ?? throw new NullReferenceException();
return results;
}
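// Descriptive note: the hosts file is treated as one JSON object per CRLF-terminated line;
// replacing "\r\n" with "," and wrapping the result in "[...]" turns it into a JSON array,
// and AllowTrailingCommas on the source-generation options tolerates the comma left by a
// trailing newline.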

View File

View File

@@ -1,7 +1,7 @@
using Microsoft.Extensions.Logging;
using System.Diagnostics;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI2;
internal static partial class Helper20240728
{

View File

@@ -0,0 +1,146 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.ConvertExcelToJson;
public class FIBacklogMesa
{
[JsonConstructor]
public FIBacklogMesa(string req,
string submitted,
string requestor,
string assignedTo,
string secondResource,
string subject,
string systemS,
string priority,
string training,
string prioritySubset,
string status,
string definition,
string updates,
string estEffortDays,
string commitDate,
string reCommitDate,
string uATAsOf,
string cmpDate,
string f20,
string f21,
string f22,
string f23,
string f24,
string f25,
string f26,
string f27,
string f28,
string f29,
string f30,
string f31,
string f32,
string f33)
{
Req = req;
Submitted = submitted;
Requestor = requestor;
AssignedTo = assignedTo;
SecondResource = secondResource;
Subject = subject;
SystemS = systemS;
Priority = priority;
Training = training;
PrioritySubset = prioritySubset;
Status = status;
Definition = definition;
Updates = updates;
EstEffortDays = estEffortDays;
CommitDate = commitDate;
ReCommitDate = reCommitDate;
UATAsOf = uATAsOf;
CMPDate = cmpDate;
F20 = f20;
F21 = f21;
F22 = f22;
F23 = f23;
F24 = f24;
F25 = f25;
F26 = f26;
F27 = f27;
F28 = f28;
F29 = f29;
F30 = f30;
F31 = f31;
F32 = f32;
F33 = f33;
}
public string Req { get; set; } // { init; get; }
public string Submitted { get; set; } // { init; get; }
public string Requestor { get; set; } // { init; get; }
[JsonPropertyName("Assigned To")]
public string AssignedTo { get; set; } // { init; get; }
[JsonPropertyName("Second Resource")]
public string SecondResource { get; set; } // { init; get; }
[JsonPropertyName("Subject - from Requestor")]
public string Subject { get; set; } // { init; get; }
[JsonPropertyName("System(s)")]
public string SystemS { get; set; } // { init; get; }
public string Priority { get; set; } // { init; get; }
[JsonPropertyName("Spec/ECN/Training")]
public string Training { get; set; } // { init; get; }
[JsonPropertyName("Qual/Eff")]
public string PrioritySubset { get; set; } // { init; get; }
public string Status { get; set; } // { init; get; }
[JsonPropertyName("Definition - from FI")]
public string Definition { get; set; } // { init; get; }
public string Updates { get; set; } // { init; get; }
[JsonPropertyName("Est Effort _(days)")]
public string EstEffortDays { get; set; } // { init; get; }
[JsonPropertyName("Commit Date")]
public string CommitDate { get; set; } // { init; get; }
[JsonPropertyName("Re-Commit Date")]
public string ReCommitDate { get; set; } // { init; get; }
[JsonPropertyName("UAT as of")]
public string UATAsOf { get; set; } // { init; get; }
[JsonPropertyName("CMP _Date")]
public string CMPDate { get; set; } // { init; get; }
public string F20 { get; set; } // { init; get; }
public string F21 { get; set; } // { init; get; }
public string F22 { get; set; } // { init; get; }
public string F23 { get; set; } // { init; get; }
public string F24 { get; set; } // { init; get; }
public string F25 { get; set; } // { init; get; }
public string F26 { get; set; } // { init; get; }
public string F27 { get; set; } // { init; get; }
public string F28 { get; set; } // { init; get; }
public string F29 { get; set; } // { init; get; }
public string F30 { get; set; } // { init; get; }
public string F31 { get; set; } // { init; get; }
public string F32 { get; set; } // { init; get; }
public string F33 { get; set; } // { init; get; }
}
[JsonSourceGenerationOptions(WriteIndented = true, DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, PropertyNameCaseInsensitive = true)]
[JsonSerializable(typeof(FIBacklogMesa[]))]
internal partial class FIBacklogMesaCollectionSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true, DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, PropertyNameCaseInsensitive = true)]
[JsonSerializable(typeof(FIBacklogMesa))]
internal partial class FIBacklogMesaSourceGenerationContext : JsonSerializerContext
{
}
#endif

View File

@@ -1,10 +1,24 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.Day;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240805
{
private static void RenameFiles(string find, string replace, string[] ignoreFileNames, string[] confFiles)
{
string checkFile;
foreach (string confFile in confFiles)
{
if (ignoreFileNames.Contains(confFile))
continue;
checkFile = confFile.Replace(find, replace);
if (File.Exists(checkFile))
continue;
File.Move(confFile, checkFile);
}
}
private static void RenameFiles(ILogger<Worker> logger, string sourceDirectory, string[] files, string[] lines, string globalSettingsFile)
{
string to;
@@ -55,9 +69,14 @@ internal static partial class Helper20240805
internal static void RenameFiles(ILogger<Worker> logger, List<string> args)
{
string find = args[6];
string replace = args[7];
string sourceDirectory = Path.GetFullPath(args[0]);
string globalSettingsFile = Path.GetFullPath(args[4]);
string[] files = Directory.GetFiles(sourceDirectory, args[2], SearchOption.TopDirectoryOnly);
string[] confFiles = Directory.GetFiles(sourceDirectory, $"*{find}", SearchOption.TopDirectoryOnly);
string[] ignoreFileNames = args[8].Split(',').Select(l => Path.Combine(sourceDirectory, l)).ToArray();
RenameFiles(find, replace, ignoreFileNames, confFiles);
string checkFile = Path.Combine(sourceDirectory, args[3]);
if (files.Length == 0 || !File.Exists(checkFile))
logger.LogWarning("No found!");

View File

@@ -0,0 +1,57 @@
using Microsoft.Extensions.Logging;
using System.Globalization;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240806
{
private static void TryArchiveFiles(string sourceDirectory, string pattern, string archiveDirectory, int minimumLength, int days)
{
string checkFile;
FileInfo fileInfo;
string weekOfYear;
string checkDirectory;
string[] directorySegments;
DateTime dateTime = DateTime.Now.AddDays(-days);
Calendar calendar = new CultureInfo("en-US").Calendar;
string[] sourceDirectorySegments = sourceDirectory.Split(Path.DirectorySeparatorChar);
string[] files = Directory.GetFiles(sourceDirectory, pattern, SearchOption.AllDirectories);
if (sourceDirectorySegments.Length < 1)
throw new Exception("Can't be root drive!");
foreach (string file in files)
{
fileInfo = new FileInfo(file);
if (string.IsNullOrEmpty(fileInfo.DirectoryName) || fileInfo.IsReadOnly || fileInfo.Length < minimumLength || fileInfo.LastWriteTime < dateTime)
continue;
directorySegments = fileInfo.DirectoryName.Split(Path.DirectorySeparatorChar);
if (directorySegments.Length < sourceDirectorySegments.Length)
continue;
weekOfYear = $"{fileInfo.LastWriteTime.Year}_Week_{calendar.GetWeekOfYear(fileInfo.LastWriteTime, CalendarWeekRule.FirstDay, DayOfWeek.Sunday):00}";
checkDirectory = string.Concat(archiveDirectory, Path.DirectorySeparatorChar, weekOfYear);
for (int i = sourceDirectorySegments.Length; i < directorySegments.Length; i++)
checkDirectory = string.Concat(checkDirectory, Path.DirectorySeparatorChar, directorySegments[i]);
checkDirectory = string.Concat(checkDirectory, Path.DirectorySeparatorChar, fileInfo.LastWriteTime.ToString("yyyy-MM-dd"));
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
checkFile = Path.Combine(checkDirectory, string.Concat(fileInfo.LastWriteTime.ToString("HH-mm-ss-fff"), "~", fileInfo.Name));
if (File.Exists(checkFile))
continue;
File.Move(fileInfo.FullName, checkFile);
}
}
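// Illustrative example with hypothetical paths: a file D:\Source\Logs\app.log last written
// 2024-08-06 14:30:15.123 (sourceDirectory D:\Source, archiveDirectory D:\Archive) is moved to
// D:\Archive\2024_Week_NN\Logs\2024-08-06\14-30-15-123~app.log, where NN is the
// FirstDay/Sunday week-of-year for that date.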
internal static void ArchiveFiles(ILogger<Worker> logger, List<string> args)
{
string pattern = args[4];
int days = int.Parse(args[6]);
logger.LogInformation("Hello");
string sourceDirectory = args[0];
int minimumLength = int.Parse(args[5]);
int millisecondsDelay = int.Parse(args[2]);
string archiveDirectory = Path.GetFullPath(args[7]);
TryArchiveFiles(sourceDirectory, pattern, archiveDirectory, minimumLength, days);
Thread.Sleep(millisecondsDelay);
}
}

View File

@@ -0,0 +1,719 @@
#if WorkItems
using File_Folder_Helper.Day.Q32024.ConvertExcelToJson;
using File_Folder_Helper.ADO2024.PI3.WorkItems;
using File_Folder_Helper.Models;
#endif
using Microsoft.Extensions.Logging;
#if WorkItems
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi.Models;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;
using Microsoft.VisualStudio.Services.WebApi.Patch;
using Microsoft.VisualStudio.Services.WebApi.Patch.Json;
using System.Collections.ObjectModel;
using System.Globalization;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Web;
#endif
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240809
{
#if WorkItems
private static void AddPatch(JsonPatchDocument document, string path, object value) =>
document.Add(new JsonPatchOperation { From = null, Operation = Operation.Add, Path = path, Value = value });
private static Dictionary<string, FIBacklogMesa> GetFIBacklogMesaCollection(string json)
{
Dictionary<string, FIBacklogMesa> results = [];
string key;
FIBacklogMesa[]? fiBacklogMesaCollection;
fiBacklogMesaCollection = JsonSerializer.Deserialize(json, FIBacklogMesaCollectionSourceGenerationContext.Default.FIBacklogMesaArray);
if (fiBacklogMesaCollection is null || fiBacklogMesaCollection.Length == 0)
throw new NullReferenceException();
foreach (FIBacklogMesa fiBacklogMesa in fiBacklogMesaCollection)
{
if (string.IsNullOrEmpty(fiBacklogMesa.Req))
continue;
if (string.IsNullOrEmpty(fiBacklogMesa.Submitted))
continue;
if (string.IsNullOrEmpty(fiBacklogMesa.Requestor))
continue;
key = $"{fiBacklogMesa.Req} - ";
if (results.ContainsKey(key))
continue;
results.Add(key, fiBacklogMesa);
}
return results;
}
private static string GetIds(HttpClient httpClient, string basePage, string api, string query)
{
StringBuilder result = new();
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, query));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
WIQL.Root? root = JsonSerializer.Deserialize(streamTask.Result, WIQL.WIQLRootSourceGenerationContext.Default.Root);
streamTask.Result.Dispose();
if (root is null || root.WorkItems is null)
throw new NullReferenceException(nameof(root));
foreach (WIQL.WorkItem workItem in root.WorkItems)
_ = result.Append(workItem.Id).Append(',');
if (result.Length > 0)
_ = result.Remove(result.Length - 1, 1);
return result.ToString();
}
private static ReadOnlyCollection<ValueWithReq> GetWorkItems(HttpClient httpClient, string basePage, string api, string sourceDirectory, string ids)
{
List<ValueWithReq> results = [];
int req;
string json;
string file;
Value? value;
string[] segments;
JsonElement[] jsonElements;
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, $"/workitems?ids={ids}"));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
if (result is null || result.Value.ValueKind != JsonValueKind.Object)
throw new NullReferenceException(nameof(result));
JsonProperty[] jsonProperties = result.Value.EnumerateObject().ToArray();
foreach (JsonProperty jsonProperty in jsonProperties)
{
if (jsonProperty.Value.ValueKind != JsonValueKind.Array)
continue;
jsonElements = jsonProperty.Value.EnumerateArray().ToArray();
foreach (JsonElement jsonElement in jsonElements)
{
json = jsonElement.GetRawText();
value = JsonSerializer.Deserialize(json, ValueSourceGenerationContext.Default.Value);
if (value is null)
continue;
segments = value.Fields.SystemTitle.Split('-');
if (segments.Length < 2)
continue;
if (!int.TryParse(segments[0], out req) || req == 0)
continue;
file = Path.Combine(sourceDirectory, $"{req}-{value.Id}.json");
File.WriteAllText(file, json);
results.Add(new(value, req, json));
}
}
return new(results);
}
private static ReadOnlyCollection<ValueWithReq> RemoveFrom(Dictionary<string, FIBacklogMesa> keyToFIBacklogMesa, ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
List<ValueWithReq> results = [];
foreach (ValueWithReq valueWithReq in valueWithReqCollection)
{
if (keyToFIBacklogMesa.Remove($"{valueWithReq.Req} - "))
continue;
results.Add(valueWithReq);
}
return new(results);
}
private static ReadOnlyCollection<Models.Comment> GetComments(HttpClient httpClient, string basePage, string api, string sourceDirectory, int req, int id)
{
List<Models.Comment> results = [];
string json;
string file;
Models.Comment? comment;
JsonElement[] jsonElements;
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, $"/workitems/{id}/comments?api-version=7.0-preview.3"));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
if (result is null || result.Value.ValueKind != JsonValueKind.Object)
throw new NullReferenceException(nameof(result));
JsonProperty[] jsonProperties = result.Value.EnumerateObject().ToArray();
foreach (JsonProperty jsonProperty in jsonProperties)
{
if (jsonProperty.Value.ValueKind != JsonValueKind.Array)
continue;
jsonElements = jsonProperty.Value.EnumerateArray().ToArray();
foreach (JsonElement jsonElement in jsonElements)
{
json = jsonElement.GetRawText();
comment = JsonSerializer.Deserialize(jsonElement, CommentSourceGenerationContext.Default.Comment);
if (comment is null || comment.WorkItemId is null || comment.Id is null)
continue;
file = Path.Combine(sourceDirectory, $"{req}-{id}-{comment.Id}-comments.json");
File.WriteAllText(file, json);
results.Add(comment);
}
}
return new(results);
}
private static void UpdateComment(HttpClient httpClient,
string basePage,
string api,
int id,
FIBacklogMesa fiBacklogMesa,
Models.Comment comment)
{
DateTime submittedDateTime;
if (!DateTime.TryParse(fiBacklogMesa.Submitted, out submittedDateTime))
submittedDateTime = DateTime.MinValue;
string updatesWithSubmitted = $"{fiBacklogMesa.Updates}<br>&nbsp;<br>{submittedDateTime:MM/dd/yyyy} - Submitted by {fiBacklogMesa.Requestor}";
string json = JsonSerializer.Serialize(new { text = updatesWithSubmitted });
StringContent stringContent = new(json, Encoding.UTF8, "application/json");
string requestUri = string.Concat(basePage, api, $"/workitems/{id}/comments/{comment.Id}?api-version=7.0-preview.3");
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.PatchAsync(requestUri, stringContent);
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
if (result is null || result.Value.ValueKind != JsonValueKind.Object)
throw new NullReferenceException(nameof(result));
}
private static void UpdateAllWorkItemsPharesComment(HttpClient httpClient,
string basePage,
string api,
string sourceDirectory,
Dictionary<string, FIBacklogMesa> keyToFIBacklogMesa,
ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
string key;
FIBacklogMesa? fIBacklogMesa;
ReadOnlyCollection<Models.Comment> comments;
foreach (ValueWithReq valueWithReq in valueWithReqCollection)
{
if (valueWithReq.Value.Fields.SystemCommentCount == 0)
continue;
key = $"{valueWithReq.Req} - ";
if (!keyToFIBacklogMesa.TryGetValue(key, out fIBacklogMesa))
continue;
comments = GetComments(httpClient, basePage, api, sourceDirectory, valueWithReq.Req, valueWithReq.Value.Id);
foreach (Models.Comment comment in comments)
{
if (comment.CreatedBy?.UniqueName is null || !comment.CreatedBy.UniqueName.Contains("Phares", StringComparison.CurrentCultureIgnoreCase))
continue;
UpdateComment(httpClient, basePage, api, valueWithReq.Value.Id, fIBacklogMesa, comment);
}
}
}
private static void UpdateIteration(HttpClient httpClient,
string basePage,
string api,
int id,
int rev)
{
string json = /*lang=json,strict*/ string.Concat("[ { \"op\": \"test\", \"path\": \"/rev\", \"value\": ", rev, " }, { \"op\": \"replace\", \"path\": \"/fields/System.IterationPath\", \"value\": \"ART SPS\" } ]");
StringContent stringContent = new(json, Encoding.UTF8, "application/json-patch+json");
string requestUri = string.Concat(basePage, api, $"/workitems/{id}?api-version=1.0");
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.PatchAsync(requestUri, stringContent);
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
if (result is null || result.Value.ValueKind != JsonValueKind.Object)
throw new NullReferenceException(nameof(result));
}
private static void UpdateAllFeaturesNotArtSPS(HttpClient httpClient,
string basePage,
string api,
ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
foreach (ValueWithReq valueWithReq in valueWithReqCollection)
{
if (valueWithReq.Value.Fields.SystemIterationPath != "ART SPS\\2024")
continue;
UpdateIteration(httpClient, basePage, api, valueWithReq.Value.Id, valueWithReq.Value.Rev);
}
}
private static DateTime? GetCommitDate(FIBacklogMesa fiBacklogMesa)
{
DateTime? result;
DateTime dateTime;
DateTime minDateTime = DateTime.MinValue.AddYears(10);
string commitDate = fiBacklogMesa.CommitDate.Split(' ')[0];
if (string.IsNullOrEmpty(commitDate))
result = null;
else
{
if (DateTime.TryParseExact(commitDate, "MM/dd/yyyy", CultureInfo.InvariantCulture, DateTimeStyles.None, out dateTime) && dateTime >= minDateTime)
result = dateTime.AddHours(12).ToUniversalTime();
else
{
if (DateTime.TryParseExact(commitDate, "dd-MMM-yy", CultureInfo.InvariantCulture, DateTimeStyles.None, out dateTime) && dateTime >= minDateTime)
result = dateTime.AddHours(12).ToUniversalTime();
else
{
if (DateTime.TryParse(commitDate, CultureInfo.InvariantCulture, DateTimeStyles.None, out dateTime) && dateTime >= minDateTime)
result = dateTime.AddHours(12).ToUniversalTime();
else
result = null;
}
}
}
return result;
}
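// Illustrative example: a CommitDate cell such as "07/15/2024 (est)" is reduced to its first
// token, parsed (MM/dd/yyyy, then dd-MMM-yy, then a general parse), shifted to local noon via
// AddHours(12), and converted to UTC; unparsable values, or dates within ten years of
// DateTime.MinValue, return null.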
private static int GetPriority(FIBacklogMesa fiBacklogMesa)
{
int result;
if (string.IsNullOrEmpty(fiBacklogMesa.Priority) || !int.TryParse(fiBacklogMesa.Priority[..1], out int priority) || priority == 0 || priority > 3)
result = 4;
else
result = priority;
return result;
}
private static int? GetPrioritySubset(FIBacklogMesa fiBacklogMesa)
{
int? result;
if (string.IsNullOrEmpty(fiBacklogMesa.PrioritySubset) || !int.TryParse(fiBacklogMesa.PrioritySubset[..1], out int prioritySubset) || prioritySubset == 0 || prioritySubset > 3)
result = null;
else
result = prioritySubset;
return result;
}
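// Illustrative example (hypothetical label): a Priority of "2 - High" maps to 2, while an empty
// cell, "0 - BugFix", or anything above 3 falls back to 4; GetPrioritySubset applies the same
// first-character parse to the Qual/Eff column but returns null instead of a fallback value.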
private static string GetIterationPath(string project, DateTime submittedDateTime) =>
submittedDateTime.Year != 2024 ? project : string.Concat(project, "\\", submittedDateTime.Year);
private static string GetTitle(FIBacklogMesa fiBacklogMesa)
{
string result = $"{fiBacklogMesa.Req} - {fiBacklogMesa.Subject.Split(new string[] { Environment.NewLine }, StringSplitOptions.None)[0]}";
if (result.Length > 128)
result = result[..127];
return result;
}
private static string GetTitle(FIBacklogMesa fiBacklogMesa, ValueWithReq valueWithReq)
{
string result = $"{valueWithReq.Req} - {fiBacklogMesa.Subject.Split(new string[] { Environment.NewLine }, StringSplitOptions.None)[0]}";
if (result.Length > 128)
result = result[..127];
return result;
}
private static string? GetMappedState(FIBacklogMesa fiBacklogMesa) =>
fiBacklogMesa.Status == "CMP" ? "Closed" : fiBacklogMesa.Status == "UAT" ? "Resolved" : fiBacklogMesa.Status == "In process" ? "Active" : null;
private static JsonPatchDocument GetBugDocument(string project, string site, ReadOnlyDictionary<string, string> assignedToNameToUser, ReadOnlyDictionary<string, string> requestorNameToUser, Task<WorkItem>? uatWorkItemTask, FIBacklogMesa fiBacklogMesa, DateTime submittedDateTime)
{
JsonPatchDocument result = [];
string title = GetTitle(fiBacklogMesa);
string iterationPath = GetIterationPath(project, submittedDateTime);
if (uatWorkItemTask?.Result.Id is not null)
AddPatch(result, "/relations/-", new WorkItemRelation() { Rel = "System.LinkTypes.Hierarchy-Forward", Url = uatWorkItemTask.Result.Url });
AddPatch(result, "/fields/System.AreaPath", string.Concat(project, "\\", site));
AddPatch(result, "/fields/System.IterationPath", iterationPath);
AddPatch(result, "/fields/System.Title", title);
AddPatch(result, "/fields/System.CreatedDate", submittedDateTime.AddHours(12).ToUniversalTime());
string? state = GetMappedState(fiBacklogMesa);
if (!string.IsNullOrEmpty(state))
AddPatch(result, "/fields/System.State", state);
if (!string.IsNullOrEmpty(fiBacklogMesa.Definition))
AddPatch(result, "/fields/System.Description", $"{fiBacklogMesa.Subject}<br>&nbsp;<br>{fiBacklogMesa.Definition}");
if (assignedToNameToUser.TryGetValue(fiBacklogMesa.AssignedTo, out string? assignedToUser))
AddPatch(result, "/fields/System.AssignedTo", assignedToUser);
if (requestorNameToUser.TryGetValue(fiBacklogMesa.Requestor, out string? requestorUser))
AddPatch(result, "/fields/Custom.Requester", requestorUser);
return result;
}
private static JsonPatchDocument GetFeatureDocument(string project, string site, ReadOnlyDictionary<string, string> requestorNameToUser, ReadOnlyDictionary<string, string> assignedToNameToUser, List<string> tags, Task<WorkItem>? userStoryWorkItemTask, FIBacklogMesa fiBacklogMesa, DateTime submittedDateTime)
{
JsonPatchDocument result = [];
string title = GetTitle(fiBacklogMesa);
int priority = GetPriority(fiBacklogMesa);
string? state = GetMappedState(fiBacklogMesa);
int? prioritySubset = GetPrioritySubset(fiBacklogMesa);
if (prioritySubset is not null)
AddPatch(result, "/fields/Microsoft.VSTS.Common.TimeCriticality", prioritySubset);
string iterationPath = GetIterationPath(project, submittedDateTime);
if (userStoryWorkItemTask?.Result.Id is not null)
AddPatch(result, "/relations/-", new WorkItemRelation() { Rel = "System.LinkTypes.Hierarchy-Forward", Url = userStoryWorkItemTask.Result.Url });
AddPatch(result, "/fields/System.AreaPath", string.Concat(project, "\\", site));
if (tags.Count > 0)
{
AddPatch(result, "/fields/System.Tags", tags.Last());
tags.RemoveAt(tags.Count - 1);
}
AddPatch(result, "/fields/System.IterationPath", iterationPath);
AddPatch(result, "/fields/Microsoft.VSTS.Common.Priority", priority);
if (!string.IsNullOrEmpty(fiBacklogMesa.Definition))
AddPatch(result, "/fields/System.Description", $"{fiBacklogMesa.Subject}<br>&nbsp;<br>{fiBacklogMesa.Definition}");
if (!string.IsNullOrEmpty(state))
AddPatch(result, "/fields/System.State", state);
if (!string.IsNullOrEmpty(fiBacklogMesa.EstEffortDays) && int.TryParse(fiBacklogMesa.EstEffortDays, out int estEffortDays) && estEffortDays != 0)
AddPatch(result, "/fields/Microsoft.VSTS.Scheduling.Effort", estEffortDays);
DateTime? dateTime = GetCommitDate(fiBacklogMesa);
if (dateTime is not null)
AddPatch(result, "/fields/Microsoft.VSTS.Scheduling.TargetDate", dateTime);
if (!string.IsNullOrEmpty(fiBacklogMesa.Updates))
AddPatch(result, "/fields/System.History", fiBacklogMesa.Updates);
AddPatch(result, "/fields/System.Title", title);
AddPatch(result, "/fields/System.CreatedDate", submittedDateTime.AddHours(12).ToUniversalTime());
if (assignedToNameToUser.TryGetValue(fiBacklogMesa.AssignedTo, out string? assignedToUser))
AddPatch(result, "/fields/System.AssignedTo", assignedToUser);
if (requestorNameToUser.TryGetValue(fiBacklogMesa.Requestor, out string? requestorUser))
AddPatch(result, "/fields/Custom.Requester", requestorUser);
// https://tfs.intra.infineon.com/tfs/ManufacturingIT/Mesa_FI/_apis/wit/workitemtypes/feature/fields?api-version=7.0
return result;
}
private static List<string> GetTags(FIBacklogMesa fiBacklogMesa)
{
List<string> results = [];
foreach (string tag in fiBacklogMesa.SystemS.Split('/'))
{
if (string.IsNullOrEmpty(tag.Trim()))
continue;
results.Add(tag.Trim());
}
return results;
}
private static List<string> GetTags(Fields fields)
{
List<string> results = [];
if (!string.IsNullOrEmpty(fields.SystemTags))
{
foreach (string tag in fields.SystemTags.Split(';'))
{
if (string.IsNullOrEmpty(tag.Trim()))
continue;
results.Add(tag.Trim());
}
}
return results;
}
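// Creates either a Bug (priority "0 - BugFix") or a Feature for the backlog entry, then applies any remaining tags one UpdateWorkItemAsync call at a time.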
private static void CreateWorkItem(WorkItemTrackingHttpClient workItemTrackingHttpClient, string project, string site, ReadOnlyDictionary<string, string> assignedToNameToUser, ReadOnlyDictionary<string, string> requestorNameToUser, FIBacklogMesa fiBacklogMesa)
{
DateTime submittedDateTime;
JsonPatchDocument tagDocument;
Task<WorkItem>? workItem = null;
List<string> tags = GetTags(fiBacklogMesa);
bool isBugFix = fiBacklogMesa.Priority == "0 - BugFix";
if (assignedToNameToUser.Count > requestorNameToUser.Count)
throw new Exception("assignedToNameToUser has more entries than requestorNameToUser!");
if (!DateTime.TryParse(fiBacklogMesa.Submitted, out submittedDateTime))
submittedDateTime = DateTime.MinValue;
if (isBugFix)
{
Task<WorkItem>? uatWorkItemTask = null;
JsonPatchDocument bugDocument = GetBugDocument(project, site, assignedToNameToUser, requestorNameToUser, uatWorkItemTask, fiBacklogMesa, submittedDateTime);
workItem = workItemTrackingHttpClient.CreateWorkItemAsync(bugDocument, project, "Bug");
workItem.Wait();
}
if (!isBugFix)
{
Task<WorkItem>? userStoryWorkItemTask = null;
JsonPatchDocument featureDocument = GetFeatureDocument(project, site, requestorNameToUser, assignedToNameToUser, tags, userStoryWorkItemTask, fiBacklogMesa, submittedDateTime);
workItem = workItemTrackingHttpClient.CreateWorkItemAsync(featureDocument, project, "Feature");
workItem.Wait();
}
for (int i = tags.Count - 1; i > -1; i--)
{
if (workItem is null)
continue;
if (workItem.Result.Id is null)
throw new NotSupportedException();
tagDocument = [];
AddPatch(tagDocument, "/fields/System.Tags", tags[i]);
tags.RemoveAt(i);
workItem = workItemTrackingHttpClient.UpdateWorkItemAsync(tagDocument, workItem.Result.Id.Value);
workItem.Wait();
}
}
private static void KillTime(int loops)
{
for (int i = 1; i < loops; i++)
Thread.Sleep(500);
}
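// Sends a PATCH to the work item REST endpoint (Windows builds only), then idles via KillTime between calls.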
private static void Update(HttpClient httpClient, string basePage, string api, string query, HttpContent httpContent)
{
#if Windows
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.PatchAsync(string.Concat(basePage, api, query), httpContent);
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<string> stringTask = httpResponseMessageTask.Result.Content.ReadAsStringAsync();
stringTask.Wait();
#endif
KillTime(30);
}
private static void Update(HttpClient httpClient, string basePage, string api, WorkItemTrackingHttpClient workItemTrackingHttpClient, string sync, ValueWithReq valueWithReq)
{
JsonPatchDocument result = [];
AddPatch(result, "/fields/System.Tags", sync);
Task<WorkItem> workItem = workItemTrackingHttpClient.UpdateWorkItemAsync(result, valueWithReq.Value.Id);
workItem.Wait();
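// result is assigned above and never set to null, so the raw REST fallback below never executes.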
if (result is null)
{
var payload = new
{
op = "replace",
path = "/fields/System.IterationPath",
value = "Mesa_FI"
};
string stringPayload = JsonSerializer.Serialize(payload);
HttpContent httpContent = new StringContent($"[{stringPayload}]", Encoding.UTF8, "application/json-patch+json");
Update(httpClient, basePage, api, $"/workitems/{valueWithReq.Value.Id}?api-version=1.0", httpContent);
}
}
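// Compares each existing work item against its FI backlog entry and tags mismatches with "Sync"; returns the number of items updated.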
private static int SetSyncTag(HttpClient httpClient,
string basePage,
string api,
WorkItemTrackingHttpClient workItemTrackingHttpClient,
ReadOnlyDictionary<string, string> assignedToNameToUser,
ReadOnlyDictionary<string, string> requestorNameToUser,
Dictionary<string, FIBacklogMesa> keyToFIBacklogMesa,
ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
int result = 0;
string key;
string title;
int priority;
bool isBugFix;
string? state;
List<string> tags;
TimeSpan timeSpan;
DateTime? dateTime;
List<string> compareTags;
const string sync = "Sync";
FIBacklogMesa? fiBacklogMesa;
foreach (ValueWithReq valueWithReq in valueWithReqCollection)
{
key = $"{valueWithReq.Req} - ";
if (!string.IsNullOrEmpty(key)) // Always true; deliberately skips the sync comparison below
continue;
compareTags = GetTags(valueWithReq.Value.Fields);
if (compareTags.Contains(sync))
continue;
if (!keyToFIBacklogMesa.TryGetValue(key, out fiBacklogMesa))
continue;
tags = GetTags(fiBacklogMesa);
title = GetTitle(fiBacklogMesa, valueWithReq);
isBugFix = fiBacklogMesa.Priority == "0 - BugFix";
_ = requestorNameToUser.TryGetValue(fiBacklogMesa.Requestor, out string? requestorUser);
if (!string.IsNullOrEmpty(requestorUser) && (valueWithReq.Value.Fields.CustomRequester is null || !valueWithReq.Value.Fields.CustomRequester.UniqueName.Equals(requestorUser, StringComparison.CurrentCultureIgnoreCase)))
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
_ = assignedToNameToUser.TryGetValue(fiBacklogMesa.AssignedTo, out string? assignedToUser);
if (!string.IsNullOrEmpty(assignedToUser) && (valueWithReq.Value.Fields.SystemAssignedTo is null || !valueWithReq.Value.Fields.SystemAssignedTo.UniqueName.Equals(assignedToUser, StringComparison.CurrentCultureIgnoreCase)))
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
if (valueWithReq.Value.Fields.SystemTitle != title)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
foreach (string tag in tags)
{
if (compareTags.Contains(tag))
continue;
_ = tags.Remove(tag);
break;
}
if (tags.Count != compareTags.Count)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
if ((isBugFix && valueWithReq.Value.Fields.SystemWorkItemType != "Bug") || (!isBugFix && valueWithReq.Value.Fields.SystemWorkItemType == "Bug"))
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
if (!isBugFix)
{
priority = GetPriority(fiBacklogMesa);
if (valueWithReq.Value.Fields.MicrosoftVSTSCommonPriority != priority)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
}
state = GetMappedState(fiBacklogMesa);
if (!string.IsNullOrEmpty(state) && valueWithReq.Value.Fields.SystemState != state)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
if (!isBugFix && int.TryParse(fiBacklogMesa.EstEffortDays, out int estEffortDays) && valueWithReq.Value.Fields.MicrosoftVSTSSchedulingEffort != estEffortDays)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
dateTime = GetCommitDate(fiBacklogMesa);
if (dateTime is not null)
{
timeSpan = new(valueWithReq.Value.Fields.MicrosoftVSTSSchedulingTargetDate.Ticks - dateTime.Value.Ticks);
if (timeSpan.TotalHours is > 32 or < -32)
{
result += 1;
Update(httpClient, basePage, api, workItemTrackingHttpClient, sync, valueWithReq);
continue;
}
}
}
return result;
}
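// Tags out-of-sync work items first and only creates new work items when nothing required a sync update.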
private static void CreateWorkItems(HttpClient httpClient,
string sourceDirectory,
string basePage,
string api,
string query,
WorkItemTrackingHttpClient workItemTrackingHttpClient,
string project,
string site,
ReadOnlyDictionary<string, string> assignedToNameToUser,
ReadOnlyDictionary<string, string> requestorNameToUser,
string json)
{
int counter = 0;
string ids = GetIds(httpClient, basePage, api, query);
Dictionary<string, FIBacklogMesa> keyToFIBacklogMesa = GetFIBacklogMesaCollection(json);
ReadOnlyCollection<ValueWithReq> valueWithReqCollection = string.IsNullOrEmpty(ids) ? new([]) : GetWorkItems(httpClient, basePage, api, sourceDirectory, ids);
int updated = SetSyncTag(httpClient, basePage, api, workItemTrackingHttpClient, assignedToNameToUser, requestorNameToUser, keyToFIBacklogMesa, valueWithReqCollection);
if (updated == 0)
{
UpdateAllFeaturesNotArtSPS(httpClient, basePage, api, valueWithReqCollection);
UpdateAllWorkItemsPharesComment(httpClient, basePage, api, sourceDirectory, keyToFIBacklogMesa, valueWithReqCollection);
ReadOnlyCollection<ValueWithReq> extra = RemoveFrom(keyToFIBacklogMesa, valueWithReqCollection);
foreach (KeyValuePair<string, FIBacklogMesa> keyValuePair in keyToFIBacklogMesa)
{
if (keyToFIBacklogMesa.Count == extra.Count)
break;
if (keyValuePair.Value.Status is "CMP" or "CNCL")
continue;
CreateWorkItem(workItemTrackingHttpClient, project, site, assignedToNameToUser, requestorNameToUser, keyValuePair.Value);
counter++;
}
}
}
private static void CreateWorkItems(ILogger<Worker> logger, string sourceDirectory, string api, string site, string query, string project, string basePage, string baseAddress, byte[] bytes, string[] assignedToNames, string[] requestorNames, string reportFullPath, MediaTypeWithQualityHeaderValue mediaTypeWithQualityHeaderValue, WorkItemTrackingHttpClient workItemTrackingHttpClient, HttpClient httpClient)
{
string base64 = Convert.ToBase64String(bytes);
string json = File.ReadAllText(reportFullPath);
httpClient.DefaultRequestHeaders.Authorization = new("Basic", base64);
httpClient.DefaultRequestHeaders.Accept.Add(mediaTypeWithQualityHeaderValue);
ReadOnlyDictionary<string, string> requestorNameToUser = GetRequestorNameToUser(requestorNames);
ReadOnlyDictionary<string, string> assignedToNameToUser = GetAssignedToNameToUser(assignedToNames);
logger.LogInformation("{baseAddress}{basePage}/{project}{api}{query}", baseAddress, basePage, HttpUtility.HtmlEncode(project), api, query);
CreateWorkItems(httpClient, sourceDirectory, basePage, api, query, workItemTrackingHttpClient, project, site, new(assignedToNameToUser), new(requestorNameToUser), json);
}
private static ReadOnlyDictionary<string, string> GetAssignedToNameToUser(string[] assignedToNames)
{
Dictionary<string, string> results = [];
string[] segments;
foreach (string assignedToName in assignedToNames)
{
segments = assignedToName.Split('|');
if (segments.Length != 2)
continue;
results.Add(segments[0], segments[1]);
}
return new(results);
}
private static ReadOnlyDictionary<string, string> GetRequestorNameToUser(string[] requestorNames)
{
Dictionary<string, string> results = [];
string[] segments;
foreach (string requestorName in requestorNames)
{
segments = requestorName.Split('|');
if (segments.Length != 2)
continue;
results.Add(segments[0], segments[1]);
}
return new(results);
}
internal static void CreateWorkItems(ILogger<Worker> logger, List<string> args)
{
string api = args[6];
string pat = args[8];
string site = args[2];
string query = args[7];
string project = args[5];
string basePage = args[4];
string baseAddress = args[3];
string sourceDirectory = args[0];
VssBasicCredential credential = new("", pat);
string[] requestorNames = args[11].Split(',');
string[] assignedToNames = args[10].Split(',');
byte[] bytes = Encoding.ASCII.GetBytes($":{pat}");
string reportFullPath = Path.GetFullPath(Path.Combine(sourceDirectory, args[9]));
VssConnection connection = new(new(string.Concat(baseAddress, basePage)), credential);
MediaTypeWithQualityHeaderValue mediaTypeWithQualityHeaderValue = new("application/json");
WorkItemTrackingHttpClient workItemTrackingHttpClient = connection.GetClient<WorkItemTrackingHttpClient>();
HttpClient httpClient = new(new HttpClientHandler() { UseDefaultCredentials = true }) { BaseAddress = new(baseAddress) };
CreateWorkItems(logger, sourceDirectory, api, site, query, project, basePage, baseAddress, bytes, assignedToNames, requestorNames, reportFullPath, mediaTypeWithQualityHeaderValue, workItemTrackingHttpClient, httpClient);
}
#else
internal static void CreateWorkItems(ILogger<Worker> logger, List<string> args)
{
logger.LogError("CreateWorkItems is not available in WorkItems {args[0]}", args[0]);
logger.LogError("CreateWorkItems is not available in WorkItems {args[1]}", args[1]);
}
#endif
}

View File

@@ -0,0 +1,40 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240820
{
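// Moves files (oldest first) from the source directory into the destination, creating directories as needed and pausing between moves.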
internal static void MoveFilesWithSleep(ILogger<Worker> logger, List<string> args)
{
string checkFile;
string checkDirectory;
int sleep = int.Parse(args[4]);
string searchPattern = args[3];
string sourceDirectory = args[0];
string destinationDirectory = args[2];
string source = Path.GetFullPath(sourceDirectory);
FileInfo[] collection = Directory.GetFiles(source, "*", SearchOption.TopDirectoryOnly).Select(l => new FileInfo(l)).ToArray();
string[] files = (from l in collection orderby l.LastWriteTime select l.FullName).ToArray();
logger.LogInformation("With search pattern '{SearchPattern}' found {files}", searchPattern, files.Length);
foreach (string file in files)
{
Thread.Sleep(500);
checkFile = file.Replace(source, destinationDirectory);
if (checkFile == file)
throw new NotSupportedException("Replace failed!");
checkDirectory = Path.GetDirectoryName(checkFile) ?? throw new NotSupportedException();
try
{
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
if (File.Exists(checkFile))
continue;
File.Move(file, checkFile);
Thread.Sleep(sleep);
}
catch (Exception ex)
{ logger.LogInformation(ex, "Inner loop error!"); }
}
}
}

View File

@@ -0,0 +1,228 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240822
{
private record Record(string? Title, ReadOnlyCollection<string> Tags, string? Completed);
private record Root([property: JsonPropertyName("headings")] Heading[] Headings,
[property: JsonPropertyName("lanes")] Lane[] Lanes);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Root))]
private partial class Helper20240822RootSourceGenerationContext : JsonSerializerContext
{
}
private record Welcome2([property: JsonPropertyName("headings")] Heading[] Headings,
[property: JsonPropertyName("lanes")] Lane[] Lanes);
private record Heading([property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("heading")] string HeadingHeading);
private record Lane([property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("columns")] Column[][] Columns);
private record Column([property: JsonPropertyName("id")] string Id,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("description")] string Description,
[property: JsonPropertyName("metadata")] Metadata? Metadata,
[property: JsonPropertyName("subTasks")] SubTask[]? SubTasks,
[property: JsonPropertyName("relations")] object[] Relations,
[property: JsonPropertyName("comments")] Comment[] Comments,
[property: JsonPropertyName("column")] string ColumnColumn,
[property: JsonPropertyName("workload")] long Workload,
[property: JsonPropertyName("progress")] long Progress,
[property: JsonPropertyName("remainingWorkload")] long RemainingWorkload,
[property: JsonPropertyName("dueData")] DueData DueData);
private record Comment([property: JsonPropertyName("text")] string Text,
[property: JsonPropertyName("date")] DateTimeOffset Date);
private record DueData([property: JsonPropertyName("completed")] bool Completed,
[property: JsonPropertyName("completedDate")] object CompletedDate,
[property: JsonPropertyName("dueDate")] DateTimeOffset DueDate,
[property: JsonPropertyName("overdue")] bool Overdue,
[property: JsonPropertyName("dueDelta")] long DueDelta,
[property: JsonPropertyName("dueMessage")] string DueMessage);
private record Metadata([property: JsonPropertyName("assigned")] string Assigned,
[property: JsonPropertyName("created")] DateTimeOffset Created,
[property: JsonPropertyName("progress")] long? Progress,
[property: JsonPropertyName("started")] DateTimeOffset? Started,
[property: JsonPropertyName("status")] string? Status,
[property: JsonPropertyName("tags")] string[]? Tags,
[property: JsonPropertyName("type")] string? Type,
[property: JsonPropertyName("updated")] DateTimeOffset Updated,
[property: JsonPropertyName("due")] DateTimeOffset? Due,
[property: JsonPropertyName("completed")] DateTimeOffset? Completed);
private record SubTask([property: JsonPropertyName("text")] string Text,
[property: JsonPropertyName("completed")] bool Completed);
private static ReadOnlyCollection<ReadOnlyCollection<Record>> GetRecords(Column[][] columnCollection)
{
List<ReadOnlyCollection<Record>> results = [];
bool check;
int subTasks;
Column column;
int completed;
List<Record> row;
string? subtasks;
List<string> tags;
for (int i = 0; i < int.MaxValue; i++)
{
row = [];
check = false;
foreach (Column[] columns in columnCollection)
{
if (columns.Length <= i)
row.Add(new(null, new([]), null));
else
{
tags = [];
subTasks = 0;
completed = 0;
column = columns[i];
if (!check)
check = true;
if (column.Metadata?.Tags is not null && column.Metadata.Tags.Length != 0)
{
foreach (string tag in column.Metadata.Tags)
tags.Add(tag);
}
if (column.SubTasks is not null && column.SubTasks.Length != 0)
{
foreach (SubTask subTask in column.SubTasks)
{
subTasks += 1;
if (subTask.Completed)
completed += 1;
}
}
subtasks = subTasks == 0 ? null : $"{completed} / {subTasks}";
row.Add(new(column.Name, new(tags), subtasks));
}
}
if (!check)
break;
if (results.Count > 0)
{
if (results[0].Count != row.Count)
throw new Exception("Rows must match!");
}
results.Add(new(row));
}
return new(results);
}
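// Renders the board as a standalone HTML table (one column per heading) with light/dark color-scheme support.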
private static void WriteFile(string destinationFile, Heading[] headings, ReadOnlyCollection<ReadOnlyCollection<Record>> recordCollection)
{
string title;
string completed;
List<string> lines =
[
"<html>",
"<head>",
"<style>",
":root {",
"color-scheme: light dark;",
"}",
"body {",
"color: light-dark(#333b3c, #efedea);",
"background-color: light-dark(#efedea, #333b3c);",
"}",
"td {",
"vertical-align: top;",
"}",
".title {",
"font-weight: bold;",
"}",
".completed {",
"font-size: small;",
"}",
".speech-bubble {",
"position: relative;",
"background: darkCyan;",
"border-radius: .4em;",
"}",
".speech-bubble:after {",
"content: '';",
"position: absolute;",
"bottom: 0;",
"left: 50%;",
"width: 0;",
"height: 0;",
"border: 2px solid transparent;",
"border-top-color: #00aabb;",
"border-bottom: 0;",
"margin-left: -2px;",
"margin-bottom: -2px;",
"}",
"</style>",
"</head>",
"<table border=\"1\">",
"<tr>"
];
foreach (Heading heading in headings)
lines.Add($"<th>{heading.Name}</th>");
lines.Add("</tr>");
foreach (ReadOnlyCollection<Record> records in recordCollection)
{
lines.Add("<tr>");
foreach (Record record in records)
{
lines.Add("<td>");
title = record.Title is null ? "&nbsp;" : record.Title;
completed = record.Completed is null ? "&nbsp;" : record.Completed;
lines.Add($"<div class=\"title\">{title}</div>");
lines.Add("<div>");
foreach (string tag in record.Tags)
lines.Add($"<span class=\"speech-bubble\">{tag}</span>");
lines.Add("</div>");
lines.Add($"<div class=\"completed\">{completed}</div>");
lines.Add("</td>");
}
lines.Add("</tr>");
}
lines.Add("</table>");
lines.Add("</html>");
File.WriteAllLines(destinationFile, lines);
}
private static void ParseKanbn(ILogger<Worker> logger, string destinationFile, string json)
{
Root? root = JsonSerializer.Deserialize(json, Helper20240822RootSourceGenerationContext.Default.Root);
if (root is null)
logger.LogInformation("<{root}> is null!", root);
else if (root.Lanes.Length != 1)
logger.LogInformation("{root.Lanes} != 1", root.Lanes.Length);
else if (root.Lanes[0].Columns.Length != root.Headings.Length)
logger.LogInformation("{root[0].Columns.Lanes} != {root.Headings}", root.Lanes[0].Columns.Length, root.Headings.Length);
else
{
ReadOnlyCollection<ReadOnlyCollection<Record>> recordCollection = GetRecords(root.Lanes[0].Columns);
WriteFile(destinationFile, root.Headings, recordCollection);
}
}
internal static void ParseKanbn(ILogger<Worker> logger, List<string> args)
{
string sourceDirectory = Path.GetFullPath(args[0]);
string sourceFile = Path.Combine(sourceDirectory, args[2]);
string destinationFile = Path.Combine(sourceDirectory, $"{DateTime.Now.Ticks}-{args[3]}");
if (!File.Exists(sourceFile))
logger.LogInformation("<{sourceFile}> doesn't exist!", sourceFile);
else
{
string json = File.ReadAllText(sourceFile);
ParseKanbn(logger, destinationFile, json);
}
}
}

View File

@@ -0,0 +1,248 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.RegularExpressions;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240828
{
private record HeaderCommon(DateTime Date,
string? Employee,
string? Layer,
string? MesEntity,
string? PSN,
string? Quantity,
string? RDS,
string? Reactor,
string? Recipe,
string? Zone);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(HeaderCommon))]
private partial class HeaderCommonSourceGenerationContext : JsonSerializerContext
{
}
private record Record(string? CassetteId,
ReadOnlyCollection<string>? CassetteSegments,
DateTime? Date,
string? Employee,
ReadOnlyCollection<string>? EquipmentSegments,
int I,
string? LastDate,
ReadOnlyCollection<ReadOnlyCollection<string>>? Matches);
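// Parses one four-line log entry into a Record (cassette id, date, employee, equipment segments); returns a mostly-null Record when the expected shape is not found.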
private static Record GetRecord(int i, string? lastDate, ReadOnlyCollection<ReadOnlyCollection<string>> matches)
{
Record result;
if (matches.Count != 4 || matches[0].Count != 3 || matches[3].Count != 3)
result = new Record(null, null, null, null, null, i, null, null);
else
{
string[] equipmentSegments = matches[1][2].Split('|');
if (equipmentSegments.Length != 2)
result = new Record(null, null, null, null, null, i, null, null);
else
{
string[] cassetteIdSegments = matches[3][2].Split('|');
if (cassetteIdSegments.Length <= 3)
result = new Record(null, null, null, null, null, i, null, null);
else
{
string cassetteId = Regex.Replace(cassetteIdSegments[2], @"[\\,\/,\:,\*,\?,\"",\<,\>,\|]", "_").Split('\r')[0].Split('\n')[0];
result = new Record(cassetteId,
new(cassetteIdSegments),
DateTime.Parse(matches[0][0]),
matches[0][1],
new(equipmentSegments),
i,
lastDate,
new(matches));
}
}
}
return result;
}
private static ReadOnlyCollection<Record> GetRecords(string logDirectory, string logSearchPattern)
{
List<Record> results = [];
Record record;
string[] lines;
string[] logFiles = Directory.GetFiles(logDirectory, logSearchPattern, SearchOption.TopDirectoryOnly);
if (logFiles.Length > 0)
{ }
foreach (string logFile in new List<string>()) // logFiles)
{
lines = File.ReadAllLines(logFile);
for (int i = lines.Length - 1; i >= 0; i--)
{
record = GetRecord(lines, i);
i = record.I;
if (record.CassetteId is null || record.CassetteSegments is null || record.Date is null || record.Employee is null || record.EquipmentSegments is null || record.Matches is null)
{
if (i < 4)
break;
continue;
}
results.Add(record);
}
}
return new(results);
}
private static Dictionary<int, ReadOnlyCollection<Record>> GetKeyValuePairs(Dictionary<int, List<Record>> keyValuePairs)
{
Dictionary<int, ReadOnlyCollection<Record>> results = [];
foreach (KeyValuePair<int, List<Record>> keyValuePair in keyValuePairs)
results.Add(keyValuePair.Key, new(keyValuePair.Value));
return new(results);
}
private static Record GetRecord(string[] lines, int i)
{
Record result;
int ii = i;
string line;
string[] segments;
string? lastDate = null;
List<ReadOnlyCollection<string>> matches = [];
for (int j = i; j >= 0; j--)
{
ii = j;
line = lines[j];
segments = line.Split(',');
if (segments.Length < 2)
continue;
lastDate ??= segments[0];
if (segments[0] != lastDate)
{
lastDate = segments[0];
break;
}
matches.Add(new(segments));
}
result = GetRecord(ii + 1, lastDate, new(matches));
return result;
}
private static Dictionary<int, ReadOnlyCollection<Record>> GetKeyValuePairs(string logSearchPattern, string logDirectory)
{
Dictionary<int, ReadOnlyCollection<Record>> results;
int totalMinutes;
TimeSpan timeSpan;
List<Record>? collection;
Dictionary<int, List<Record>> keyValuePairs = [];
ReadOnlyCollection<Record> records = GetRecords(logDirectory, logSearchPattern);
foreach (Record record in records)
{
if (record.CassetteId is null || record.CassetteSegments is null || record.Date is null || record.Employee is null || record.EquipmentSegments is null || record.Matches is null)
continue;
timeSpan = TimeSpan.FromTicks(record.Date.Value.Ticks);
totalMinutes = (int)Math.Floor(timeSpan.TotalMinutes);
if (!keyValuePairs.TryGetValue(totalMinutes, out collection))
{
collection = [];
keyValuePairs.Add(totalMinutes, collection);
}
collection.Add(record);
}
results = GetKeyValuePairs(keyValuePairs);
return results;
}
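// First pass: files in dash-named directories are paired with their single-line key file, grouped by run data sheet, and given a JSON header.
// Second pass: the other files are matched to log records by last-write minute and archived by equipment, week, date, and cassette id.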
internal static void MoveWaferCounterToArchive(ILogger<Worker> logger, List<string> args)
{
string json;
string keyFile;
string? recipe;
string[] lines;
string checkFile;
string directory;
string? quantity;
string runDataSheet;
string checkDirectory;
HeaderCommon headerCommon;
string logDateFormat = args[3];
string wcSearchPattern = args[5];
string logSearchPattern = $"SKIP---{args[2]}";
string logDirectory = Path.GetFullPath(args[0]);
string sourceDirectory = Path.GetFullPath(args[4]);
string archiveDirectory = Path.GetFullPath(args[6]);
FileInfo[] collection = Directory.GetFiles(sourceDirectory, wcSearchPattern, SearchOption.AllDirectories).Select(l => new FileInfo(l)).ToArray();
logger.LogInformation("Found {collection}(s)", collection.Length);
foreach (FileInfo fileInfo in collection)
{
if (fileInfo.DirectoryName is null || !fileInfo.DirectoryName.Contains('-'))
continue;
lines = File.ReadAllLines(fileInfo.FullName);
recipe = lines.Length < 2 ? null : lines[1];
quantity = lines.Length < 1 ? null : lines[0];
keyFile = $"{fileInfo.FullName}.txt";
if (!File.Exists(keyFile))
continue;
lines = File.ReadAllLines(keyFile);
if (lines.Length != 1)
continue;
runDataSheet = Regex.Replace(lines[0], @"[\\,\/,\:,\*,\?,\"",\<,\>,\|]", ".").Split('\r')[0].Split('\n')[0];
directory = Path.Combine(fileInfo.DirectoryName, runDataSheet);
checkDirectory = Path.Combine(directory, fileInfo.LastWriteTime.Ticks.ToString());
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
checkFile = Path.Combine(directory, fileInfo.Name);
if (File.Exists(checkFile))
continue;
headerCommon = new(fileInfo.LastWriteTime, null, null, null, null, quantity, runDataSheet, null, recipe, null);
json = JsonSerializer.Serialize(headerCommon, HeaderCommonSourceGenerationContext.Default.HeaderCommon);
File.Move(fileInfo.FullName, checkFile);
File.Delete(keyFile);
checkFile = Path.Combine(checkDirectory, $"{fileInfo.Name}.json");
if (File.Exists(checkFile))
continue;
File.WriteAllText(checkFile, json);
Directory.SetLastWriteTime(checkDirectory, fileInfo.LastWriteTime);
}
Record record;
int totalMinutes;
string weekOfYear;
TimeSpan timeSpan;
ReadOnlyCollection<Record>? records;
Calendar calendar = new CultureInfo("en-US").Calendar;
Dictionary<int, ReadOnlyCollection<Record>> keyValuePairs = GetKeyValuePairs(logSearchPattern, logDirectory);
logger.LogInformation("Mapped {keyValuePairs}(s)", keyValuePairs.Count);
foreach (FileInfo fileInfo in collection)
{
if (fileInfo.DirectoryName is null || fileInfo.DirectoryName.Contains('-'))
continue;
timeSpan = TimeSpan.FromTicks(fileInfo.LastWriteTime.Ticks);
totalMinutes = (int)Math.Floor(timeSpan.TotalMinutes);
if (!keyValuePairs.TryGetValue(totalMinutes, out records))
continue;
if (records.Count != 1)
continue;
record = records[0];
if (record.CassetteId is null || record.CassetteSegments is null || record.Date is null || record.Employee is null || record.EquipmentSegments is null || record.Matches is null)
continue;
weekOfYear = $"{record.Date.Value.Year}_Week_{calendar.GetWeekOfYear(record.Date.Value, CalendarWeekRule.FirstDay, DayOfWeek.Sunday):00}";
checkDirectory = Path.Combine(archiveDirectory,
string.Join('-', record.EquipmentSegments.Reverse()),
weekOfYear,
record.Date.Value.ToString("yyyy-MM-dd"),
record.CassetteId);
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
checkFile = Path.Combine(checkDirectory, fileInfo.Name);
if (File.Exists(checkFile))
continue;
File.Move(fileInfo.FullName, checkFile);
lines = record.Matches.Select(l => string.Join(',', l)).ToArray();
File.WriteAllLines($"{checkFile}.txt", lines);
}
}
}

View File

@@ -0,0 +1,250 @@
#if WorkItems
using File_Folder_Helper.ADO2024.PI3.WorkItems;
#endif
using Microsoft.Extensions.Logging;
#if WorkItems
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;
using System.Collections.ObjectModel;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Web;
#endif
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240830
{
#if WorkItems
private record WorkItem(string AreaPath,
string? AssignedTo,
int? BusinessValue,
DateTime ChangedDate,
DateTime? ClosedDate,
int CommentCount,
DateTime CreatedDate,
string Description,
float? Effort,
int Id,
string IterationPath,
int? Parent,
int? Priority,
object[] Relations,
string? Requester,
DateTime? ResolvedDate,
int Revision,
int? RiskReductionMinusOpportunityEnablement,
DateTime? StartDate,
string State,
string Tags,
DateTime? TargetDate,
float? TimeCriticality,
string Title,
string WorkItemType,
float? WeightedShortestJobFirst);
private static void CompareWorkItems(ILogger<Worker> logger, string sourceDirectory, string api, string site, string query, string project, string basePage, string baseAddress, byte[] bytes, MediaTypeWithQualityHeaderValue mediaTypeWithQualityHeaderValue, WorkItemTrackingHttpClient workItemTrackingHttpClient, HttpClient httpClient)
{
string base64 = Convert.ToBase64String(bytes);
httpClient.DefaultRequestHeaders.Authorization = new("Basic", base64);
httpClient.DefaultRequestHeaders.Accept.Add(mediaTypeWithQualityHeaderValue);
logger.LogInformation("{baseAddress}{basePage}/{project}{api}{query}", baseAddress, basePage, HttpUtility.HtmlEncode(project), api, query);
CompareWorkItems(httpClient, sourceDirectory, basePage, api, query, workItemTrackingHttpClient, project, site);
}
private static void GetSingle(HttpClient httpClient, string basePage, string api, string targetFileLocation, int id)
{
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, $"/workitems/{id}?%24expand=1"));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
string file = Path.Combine(targetFileLocation, $"{-9}-{id}.json");
File.WriteAllText(file, result.ToString());
}
internal static void CompareWorkItems(ILogger<Worker> logger, List<string> args)
{
string api = args[6];
string pat = args[8];
string site = args[2];
string query = args[7];
string project = args[5];
string basePage = args[4];
string baseAddress = args[3];
string sourceDirectory = args[0];
VssBasicCredential credential = new("", pat);
byte[] bytes = Encoding.ASCII.GetBytes($":{pat}");
VssConnection connection = new(new(string.Concat(baseAddress, basePage)), credential);
MediaTypeWithQualityHeaderValue mediaTypeWithQualityHeaderValue = new("application/json");
WorkItemTrackingHttpClient workItemTrackingHttpClient = connection.GetClient<WorkItemTrackingHttpClient>();
HttpClient httpClient = new(new HttpClientHandler() { UseDefaultCredentials = true }) { BaseAddress = new(baseAddress) };
CompareWorkItems(logger, sourceDirectory, api, site, query, project, basePage, baseAddress, bytes, mediaTypeWithQualityHeaderValue, workItemTrackingHttpClient, httpClient);
}
private static ReadOnlyCollection<WorkItem> GetWorkItems(ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
List<WorkItem> results = [];
Fields fields;
WorkItem workItem;
foreach (ValueWithReq valueWithReq in valueWithReqCollection)
{
fields = valueWithReq.Value.Fields;
workItem = new(fields.SystemAreaPath,
fields.SystemAssignedTo?.DisplayName,
fields.MicrosoftVSTSCommonBusinessValue == 0 ? null : fields.MicrosoftVSTSCommonBusinessValue,
fields.SystemChangedDate,
fields.MicrosoftVSTSCommonClosedDate == DateTime.MinValue ? null : fields.MicrosoftVSTSCommonClosedDate,
fields.SystemCommentCount,
fields.SystemCreatedDate,
fields.SystemDescription,
fields.MicrosoftVSTSSchedulingEffort == 0 ? null : fields.MicrosoftVSTSSchedulingEffort,
valueWithReq.Value.Id,
fields.SystemIterationPath,
fields.SystemParent == 0 ? null : fields.SystemParent,
fields.MicrosoftVSTSCommonPriority == 0 ? null : fields.MicrosoftVSTSCommonPriority,
valueWithReq.Value.Relations,
fields.CustomRequester?.DisplayName,
fields.MicrosoftVSTSCommonResolvedDate == DateTime.MinValue ? null : fields.MicrosoftVSTSCommonResolvedDate,
valueWithReq.Value.Rev,
fields.CustomRRminusOE == 0 ? null : fields.CustomRRminusOE,
fields.MicrosoftVSTSSchedulingStartDate == DateTime.MinValue ? null : fields.MicrosoftVSTSSchedulingStartDate,
fields.SystemState,
fields.SystemTags,
fields.MicrosoftVSTSSchedulingTargetDate == DateTime.MinValue ? null : fields.MicrosoftVSTSSchedulingTargetDate,
fields.MicrosoftVSTSCommonTimeCriticality == 0 ? null : fields.MicrosoftVSTSCommonTimeCriticality,
fields.SystemTitle,
fields.SystemWorkItemType,
fields.CustomWSJF == 0 ? null : fields.CustomWSJF);
results.Add(workItem);
}
return new(results);
}
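// Runs the WIQL query and returns up to 200 matching work item ids as a comma-separated string.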
private static string GetIds(HttpClient httpClient, string basePage, string api, string query)
{
List<int> results = [];
StringBuilder result = new();
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, query));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
WIQL.Root? root = JsonSerializer.Deserialize(streamTask.Result, WIQL.WIQLRootSourceGenerationContext.Default.Root);
streamTask.Result.Dispose();
if (root is null || root.WorkItems is null)
throw new NullReferenceException(nameof(root));
foreach (WIQL.WorkItem workItem in root.WorkItems)
{
results.Add(workItem.Id);
if (results.Count > 199)
break;
}
foreach (int id in results)
_ = result.Append(id).Append(',');
if (result.Length > 0)
_ = result.Remove(result.Length - 1, 1);
return result.ToString();
}
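// Fetches the work items (with relations) for the given ids, writes each raw JSON payload to disk, and wraps the parsed values as ValueWithReq entries.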
private static ReadOnlyCollection<ValueWithReq> GetWorkItems(HttpClient httpClient, string basePage, string api, string targetFileLocation, string ids)
{
List<ValueWithReq> results = [];
string json;
string file;
Value? value;
JsonElement[] jsonElements;
Task<HttpResponseMessage> httpResponseMessageTask = httpClient.GetAsync(string.Concat(basePage, api, $"/workitems?ids={ids}&$expand=Relations"));
httpResponseMessageTask.Wait();
if (!httpResponseMessageTask.Result.IsSuccessStatusCode)
throw new Exception(httpResponseMessageTask.Result.StatusCode.ToString());
Task<Stream> streamTask = httpResponseMessageTask.Result.Content.ReadAsStreamAsync();
streamTask.Wait();
if (!streamTask.Result.CanRead)
throw new NullReferenceException(nameof(streamTask));
JsonElement? result = JsonSerializer.Deserialize<JsonElement>(streamTask.Result);
if (result is null || result.Value.ValueKind != JsonValueKind.Object)
throw new NullReferenceException(nameof(result));
JsonProperty[] jsonProperties = result.Value.EnumerateObject().ToArray();
foreach (JsonProperty jsonProperty in jsonProperties)
{
if (jsonProperty.Value.ValueKind != JsonValueKind.Array)
continue;
jsonElements = jsonProperty.Value.EnumerateArray().ToArray();
foreach (JsonElement jsonElement in jsonElements)
{
json = jsonElement.GetRawText();
value = JsonSerializer.Deserialize(json, ValueSourceGenerationContext.Default.Value);
if (value is null)
continue;
if (value.Id == 120593)
GetSingle(httpClient, basePage, api, targetFileLocation, value.Id);
file = Path.Combine(targetFileLocation, $"{-1}-{value.Id}.json");
File.WriteAllText(file, json);
results.Add(new(value, -1, json));
}
}
return new(results);
}
private static void CompareWorkItems(WorkItemTrackingHttpClient workItemTrackingHttpClient,
string sourceDirectory,
string project,
string site,
ReadOnlyCollection<ValueWithReq> valueWithReqCollection)
{
ArgumentNullException.ThrowIfNull(workItemTrackingHttpClient);
if (string.IsNullOrEmpty(project))
throw new ArgumentException($"'{nameof(project)}' cannot be null or empty.", nameof(project));
if (string.IsNullOrEmpty(sourceDirectory))
throw new ArgumentException($"'{nameof(sourceDirectory)}' cannot be null or empty.", nameof(site));
if (string.IsNullOrEmpty(site))
throw new ArgumentException($"'{nameof(site)}' cannot be null or empty.", nameof(site));
ReadOnlyCollection<WorkItem> workItems = GetWorkItems(valueWithReqCollection);
string file = Path.Combine(sourceDirectory, $"_.json");
string json = JsonSerializer.Serialize(workItems);
File.WriteAllText(file, json);
foreach (WorkItem workItem in workItems)
{
if (workItem is null)
{ }
}
// https://stackoverflow.com/questions/18153998/how-do-i-remove-all-html-tags-from-a-string-without-knowing-which-tags-are-in-it
}
private static void CompareWorkItems(HttpClient httpClient,
string targetFileLocation,
string basePage,
string api,
string query,
WorkItemTrackingHttpClient workItemTrackingHttpClient,
string project,
string site)
{
string ids = GetIds(httpClient, basePage, api, query);
ReadOnlyCollection<ValueWithReq> valueWithReqCollection = string.IsNullOrEmpty(ids) ? new([]) : GetWorkItems(httpClient, basePage, api, targetFileLocation, ids);
CompareWorkItems(workItemTrackingHttpClient, targetFileLocation, project, site, valueWithReqCollection);
}
#else
internal static void CompareWorkItems(ILogger<Worker> logger, List<string> args)
{
logger.LogError("CompareWorkItems is not available in WorkItems {args[0]}", args[0]);
logger.LogError("CompareWorkItems is not available in WorkItems {args[1]}", args[1]);
}
#endif
}

View File

@@ -0,0 +1,65 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Globalization;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240910
{
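// Re-files matching files into a year-bucket/week-of-year/day layout based on last write time, then prunes empty directories.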
internal static void MoveFilesToWeekOfYear(ILogger<Worker> logger, List<string> args)
{
string day;
string year;
string yearB;
string yearC;
string checkFile;
FileInfo fileInfo;
string weekOfYear;
int weekOfYearValue;
string checkDirectory;
string searchPattern = args[2];
ReadOnlyCollection<string> directoryNames;
string sourceDirectory = Path.GetFullPath(args[0]);
Calendar calendar = new CultureInfo("en-US").Calendar;
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
logger.LogInformation("With search pattern '{SearchPattern}' found {files}", searchPattern, files.Length);
foreach (string file in files)
{
fileInfo = new(file);
if (string.IsNullOrEmpty(fileInfo.DirectoryName))
continue;
checkDirectory = string.Empty;
year = $"{fileInfo.LastWriteTime:yyyy}";
yearB = $"{fileInfo.LastWriteTime:yyyy}_Year";
day = fileInfo.LastWriteTime.ToString("yyyy-MM-dd");
directoryNames = Helpers.HelperDirectory.GetDirectoryNames(fileInfo.DirectoryName);
weekOfYearValue = calendar.GetWeekOfYear(fileInfo.LastWriteTime, CalendarWeekRule.FirstDay, DayOfWeek.Sunday);
yearC = weekOfYearValue < 27 ? $"{fileInfo.LastWriteTime:yyyy}_Year_A" : $"{fileInfo.LastWriteTime:yyyy}_Year_Z";
weekOfYear = $"{fileInfo.LastWriteTime.Year}_Week_{weekOfYearValue:00}";
foreach (string directoryName in directoryNames)
{
if (directoryName == year || directoryName == yearB || directoryName == yearC || directoryName == weekOfYear || directoryName == day)
continue;
checkDirectory = Path.Combine(checkDirectory, directoryName);
}
if (string.IsNullOrEmpty(checkDirectory))
continue;
checkDirectory = Path.Combine(checkDirectory, yearC, weekOfYear, day);
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
checkFile = Path.Combine(checkDirectory, fileInfo.Name);
if (checkFile.Length > 256 || checkFile == fileInfo.FullName)
continue;
try
{
if (File.Exists(checkFile))
continue;
File.Move(fileInfo.FullName, checkFile);
}
catch (Exception ex)
{ logger.LogInformation(ex, $"Inner loop error <{fileInfo.FullName}>!"); }
}
Helpers.HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, sourceDirectory);
}
}

View File

@@ -0,0 +1,776 @@
using Microsoft.Extensions.Logging;
#if CommonMark
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
#endif
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240911
{
#if CommonMark
private record Attribute([property: JsonPropertyName("isLocked")] bool IsLocked,
[property: JsonPropertyName("name")] string Name);
private record Relation([property: JsonPropertyName("rel")] string Type,
[property: JsonPropertyName("url")] string URL,
[property: JsonPropertyName("attributes")] Attribute Attributes);
private record WorkItem(string AreaPath,
string? AssignedTo,
int? BusinessValue,
DateTime ChangedDate,
DateTime? ClosedDate,
int CommentCount,
DateTime CreatedDate,
string Description,
float? Effort,
int Id,
string IterationPath,
int? Parent,
int? Priority,
Relation[] Relations,
string? Requester,
DateTime? ResolvedDate,
int Revision,
int? RiskReductionMinusOpportunityEnablement,
DateTime? StartDate,
string State,
string Tags,
DateTime? TargetDate,
float? TimeCriticality,
string Title,
string? Violation,
float? WeightedShortestJobFirst,
string WorkItemType)
{
public override string ToString() => $"{Id} - {WorkItemType} - {Title}";
public static WorkItem Get(WorkItem workItem, string? violation) =>
new(workItem.AreaPath,
workItem.AssignedTo,
workItem.BusinessValue,
workItem.ChangedDate,
workItem.ClosedDate,
workItem.CommentCount,
workItem.CreatedDate,
workItem.Description,
workItem.Effort,
workItem.Id,
workItem.IterationPath,
workItem.Parent,
workItem.Priority,
workItem.Relations,
workItem.Requester,
workItem.ResolvedDate,
workItem.Revision,
workItem.RiskReductionMinusOpportunityEnablement,
workItem.StartDate,
workItem.State,
workItem.Tags,
workItem.TargetDate,
workItem.TimeCriticality,
workItem.Title,
workItem.Violation is null ? violation : workItem.Violation,
workItem.WeightedShortestJobFirst,
workItem.WorkItemType);
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(WorkItem[]))]
private partial class WorkItemCollectionSourceGenerationContext : JsonSerializerContext
{
}
private record Record(WorkItem WorkItem,
WorkItem? Parent,
ReadOnlyCollection<Record> Children);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Record[]))]
private partial class RecordCollectionCommonSourceGenerationContext : JsonSerializerContext
{
}
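// Extracts the child work item id from a "Child" relation URL; returns null for any other relation.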
private static int? GetIdFromUrlIfChild(Relation relation)
{
int? result;
string[] segments = relation?.Attributes is null || relation.Attributes.Name != "Child" ? [] : relation.URL.Split('/');
if (segments.Length < 2)
result = null;
else
{
if (!int.TryParse(segments[^1], out int id))
result = null;
else
result = id;
}
return result;
}
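// Recursively resolves a work item's child relations into Record nodes, capping recursion depth via the nests list.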
private static ReadOnlyCollection<Record> GetKeyValuePairs(ReadOnlyDictionary<int, WorkItem> keyValuePairs, WorkItem workItem, List<bool> nests)
{
List<Record> results = [];
int? childId;
Record record;
nests.Add(true);
WorkItem? childWorkItem;
WorkItem? parentWorkItem;
List<WorkItem> collection = [];
ReadOnlyCollection<Record> records;
if (workItem.Relations is not null && workItem.Relations.Length > 0)
{
collection.Clear();
foreach (Relation relation in workItem.Relations)
{
childId = GetIdFromUrlIfChild(relation);
if (childId is not null && workItem.Parent is not null && relation?.URL is not null && relation.URL.Contains(workItem.Parent.Value.ToString()))
continue;
if (childId is null || !keyValuePairs.TryGetValue(childId.Value, out childWorkItem))
continue;
collection.Add(childWorkItem);
}
collection = (from l in collection orderby l.State != "Closed", l.Id select l).ToList();
foreach (WorkItem w in collection)
{
if (nests.Count > 99)
break;
if (w.Parent is null)
parentWorkItem = null;
else
_ = keyValuePairs.TryGetValue(w.Parent.Value, out parentWorkItem);
records = GetKeyValuePairs(keyValuePairs, w, nests);
record = new(w, parentWorkItem, records);
results.Add(record);
}
}
return new(results);
}
private static void AppendLines(List<char> spaces, List<string> lines, Record record, bool condensed, bool sprintOnly)
{
string line;
spaces.Add('\t');
WorkItem workItem;
foreach (Record child in record.Children)
{
workItem = child.WorkItem;
line = GetLine(spaces, workItem, child, condensed, sprintOnly).TrimEnd();
lines.Add(line);
AppendLines(spaces, lines, child, condensed, sprintOnly);
}
spaces.RemoveAt(0);
}
private static void AppendLines(string url, List<char> spaces, List<string> lines, ReadOnlyCollection<Record> records, string workItemType)
{
List<string> results = [];
string? maxIterationPath;
List<string> distinct = [];
foreach (Record record in records)
{
// if (record.WorkItem.Id != 109724)
// continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
results.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
results.Add(string.Empty);
results.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
if (record.Children.Count == 0)
results.Add(string.Empty);
else
{
AppendLines(spaces, results, record, condensed: true, sprintOnly: false);
results.Add(string.Empty);
distinct.Clear();
AppendLines(spaces, distinct, record, condensed: false, sprintOnly: true);
if (distinct.Count > 1)
{
results.Add($"## Distinct Iteration Path(s) - {record.WorkItem.WorkItemType} - {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title} - {record.WorkItem.IterationPath}");
results.Add(string.Empty);
results.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
distinct.Sort();
distinct = (from l in distinct select l.Trim()).Distinct().ToList();
results.AddRange(distinct);
results.Add(string.Empty);
maxIterationPath = distinct.Max();
if (!string.IsNullOrEmpty(maxIterationPath) && maxIterationPath.Contains("] ") && maxIterationPath.Split(']')[1].Trim() != record.WorkItem.IterationPath)
{
results.Add($"### Sync to Distinct Max Iteration Path => {maxIterationPath} - {record.WorkItem.Id} - {record.WorkItem.Title}");
results.Add(string.Empty);
}
}
results.Add($"## Extended - {record.WorkItem.Id} - {record.WorkItem.Title}");
results.Add(string.Empty);
AppendLines(spaces, results, record, condensed: false, sprintOnly: false);
results.Add(string.Empty);
}
lines.AddRange(results);
results.Clear();
}
}
private static string GetClosed(WorkItem workItem) =>
workItem.State != "Closed" ? "[ ]" : "[x]";
private static string GetLine(List<char> spaces, WorkItem workItem, Record record, bool condensed, bool sprintOnly)
{
string result;
string closed = GetClosed(workItem);
result = sprintOnly ? $"\t- [ ] {workItem.IterationPath}" :
condensed ? $"{new string(spaces.Skip(1).ToArray())}- {closed} {record.WorkItem.Id} - {workItem.Title}" :
$"{new string(spaces.Skip(1).ToArray())}- {closed} {record.WorkItem.Id} - {workItem.Title} ~~~ {workItem.AssignedTo} - {workItem.IterationPath.Replace('\\', '-')} - {workItem.CreatedDate} --- {workItem.ClosedDate}";
return result;
}
private static ReadOnlyCollection<string> GetChildrenDirectories(ReadOnlyDictionary<int, Record> keyValuePairs, List<bool> nests, string parentDirectory, Record record)
{
List<string> results = [];
nests.Add(true);
string directory;
Record? childRecord;
ReadOnlyCollection<string> childrenDirectories;
foreach (Record r in record.Children)
{
// if (record.WorkItem.Id == 110730)
// continue;
// if (record.WorkItem.Id == 110732)
// continue;
directory = Path.Combine(parentDirectory, $"{r.WorkItem.WorkItemType[..1]}-{r.WorkItem.Id}-{r.WorkItem.Title.Trim()[..1]}");
results.Add(directory);
if (!keyValuePairs.TryGetValue(r.WorkItem.Id, out childRecord))
continue;
if (nests.Count > 99)
break;
childrenDirectories = GetChildrenDirectories(keyValuePairs, nests, directory, childRecord);
results.AddRange(childrenDirectories);
}
return new(results);
}
private static void FilterChildren(ReadOnlyCollection<string> workItemTypes, Record record, List<WorkItem> results)
{
foreach (Record r in record.Children)
{
if (!workItemTypes.Contains(r.WorkItem.WorkItemType))
continue;
results.Add(r.WorkItem);
FilterChildren(workItemTypes, r, results);
}
}
private static ReadOnlyDictionary<int, Record> GetKeyValuePairs(ReadOnlyDictionary<int, WorkItem> keyValuePairs)
{
Dictionary<int, Record> results = [];
Record record;
List<bool> nests = [];
WorkItem? parentWorkItem;
ReadOnlyCollection<Record> records;
foreach (KeyValuePair<int, WorkItem> keyValuePair in keyValuePairs)
{
nests.Clear();
if (keyValuePair.Value.Parent is null)
parentWorkItem = null;
else
_ = keyValuePairs.TryGetValue(keyValuePair.Value.Parent.Value, out parentWorkItem);
try
{
records = GetKeyValuePairs(keyValuePairs, keyValuePair.Value, nests);
record = new(keyValuePair.Value, parentWorkItem, records);
}
catch (Exception)
{
record = new(keyValuePair.Value, parentWorkItem, new([]));
}
results.Add(keyValuePair.Key, record);
}
return new(results);
}
private static ReadOnlyCollection<string> GetDirectories(string destinationDirectory, ReadOnlyDictionary<int, Record> keyValuePairs)
{
List<string> results = [];
Record record;
string directory;
List<bool> nests = [];
ReadOnlyCollection<string> childrenDirectories;
string ticksDirectory = Path.Combine(destinationDirectory, "_", DateTime.Now.Ticks.ToString());
foreach (KeyValuePair<int, Record> keyValuePair in keyValuePairs)
{
record = keyValuePair.Value;
if (record.Parent is not null && (record.WorkItem.Parent is null || record.Parent.Id != record.WorkItem.Parent.Value))
continue;
if (record.Parent is not null)
continue;
// if (record.WorkItem.Id == 110730)
// continue;
// if (record.WorkItem.Id == 110732)
// continue;
nests.Clear();
directory = Path.Combine(ticksDirectory, $"{record.WorkItem.WorkItemType[..1]}-{record.WorkItem.Id}-{record.WorkItem.Title.Trim()[..1]}");
childrenDirectories = GetChildrenDirectories(keyValuePairs, nests, directory, record);
results.AddRange(childrenDirectories);
}
return new(results.Distinct().ToArray());
}
private static int GetState(WorkItem workItem) =>
workItem.State switch
{
"New" => 1,
"Active" => 2,
"Resolved" => 3,
"Closed" => 4,
"Removed" => 5,
_ => 8
};
private static ReadOnlyCollection<WorkItem> FilterChildren(ReadOnlyCollection<string> workItemTypes, Record record)
{
List<WorkItem> results = [];
FilterChildren(workItemTypes, record, results);
return new(results);
}
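// Downloads the work item JSON from the development URL (optionally comparing it against production) and builds the id-to-Record map.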
private static ReadOnlyDictionary<int, Record> GetWorkItems(ILogger<Worker> logger, string developmentURL, string productionURL)
{
ReadOnlyDictionary<int, Record> results;
Dictionary<int, WorkItem> keyValuePairs = [];
Task<HttpResponseMessage> httpResponseMessage;
HttpClient httpClient = new(new HttpClientHandler { UseCookies = false });
httpResponseMessage = httpClient.GetAsync(developmentURL);
httpResponseMessage.Wait();
if (!httpResponseMessage.Result.IsSuccessStatusCode)
logger.LogWarning("{StatusCode} for {url}", httpResponseMessage.Result.StatusCode, developmentURL);
Task<string> developmentJSON = httpResponseMessage.Result.Content.ReadAsStringAsync();
developmentJSON.Wait();
if (!string.IsNullOrEmpty(productionURL))
{
httpResponseMessage = httpClient.GetAsync(productionURL);
httpResponseMessage.Wait();
if (!httpResponseMessage.Result.IsSuccessStatusCode)
logger.LogWarning("{StatusCode} for {url}", httpResponseMessage.Result.StatusCode, productionURL);
Task<string> productionJSON = httpResponseMessage.Result.Content.ReadAsStringAsync();
productionJSON.Wait();
if (productionJSON.Result != developmentJSON.Result)
logger.LogWarning("productionJSON doesn't match developmentJSON");
}
WorkItem[]? workItems = JsonSerializer.Deserialize(developmentJSON.Result, WorkItemCollectionSourceGenerationContext.Default.WorkItemArray);
if (workItems is null)
logger.LogWarning("workItems is null");
else
{
foreach (WorkItem workItem in workItems)
keyValuePairs.Add(workItem.Id, workItem);
}
results = GetKeyValuePairs(new(keyValuePairs));
return results;
}
private static void WriteFileStructure(string destinationDirectory, ReadOnlyDictionary<int, Record> keyValuePairs)
{
ReadOnlyCollection<string> collection = GetDirectories(destinationDirectory, keyValuePairs);
foreach (string directory in collection)
{
if (directory.Length > 222)
continue;
if (!Directory.Exists(directory))
_ = Directory.CreateDirectory(directory);
}
}
private static void WriteFiles(string destinationDirectory, ReadOnlyCollection<Record> records, string fileName)
{
string json = JsonSerializer.Serialize(records.ToArray(), RecordCollectionCommonSourceGenerationContext.Default.RecordArray);
string jsonFile = Path.Combine(destinationDirectory, $"{fileName}.json");
string jsonOld = !File.Exists(jsonFile) ? string.Empty : File.ReadAllText(jsonFile);
if (json != jsonOld)
File.WriteAllText(jsonFile, json);
}
private static void WriteFiles(string destinationDirectory, ReadOnlyCollection<string> lines, ReadOnlyCollection<WorkItem> workItems, string fileName)
{
string text = string.Join(Environment.NewLine, lines);
string markdownFile = Path.Combine(destinationDirectory, $"{fileName}.md");
string textOld = !File.Exists(markdownFile) ? string.Empty : File.ReadAllText(markdownFile);
if (text != textOld)
File.WriteAllText(markdownFile, text);
string html = CommonMark.CommonMarkConverter.Convert(text).Replace("<a href", "<a target='_blank' href");
string htmlFile = Path.Combine(destinationDirectory, $"{fileName}.html");
string htmlOld = !File.Exists(htmlFile) ? string.Empty : File.ReadAllText(htmlFile);
if (html != htmlOld)
File.WriteAllText(htmlFile, html);
string json = JsonSerializer.Serialize(workItems.ToArray(), WorkItemCollectionSourceGenerationContext.Default.WorkItemArray);
string jsonFile = Path.Combine(destinationDirectory, $"{fileName}.json");
string jsonOld = !File.Exists(jsonFile) ? string.Empty : File.ReadAllText(jsonFile);
if (json != jsonOld)
File.WriteAllText(jsonFile, json);
}
private static ReadOnlyCollection<WorkItem> GetWorkItemsNotMatching122514(Record record, ReadOnlyCollection<WorkItem> workItems)
{
List<WorkItem> results = [];
string[] segments;
string[] parentTags = record.WorkItem.Tags.Split(';').Select(l => l.Trim()).ToArray();
foreach (WorkItem workItem in workItems)
{
segments = string.IsNullOrEmpty(workItem.Tags) ? [] : workItem.Tags.Split(';').Select(l => l.Trim()).ToArray();
if (segments.Length > 0 && parentTags.Any(l => segments.Contains(l)))
continue;
results.Add(workItem);
}
return new(results);
}
private static ReadOnlyCollection<WorkItem> GetWorkItemsNotMatching126169(Record record, ReadOnlyCollection<WorkItem> workItems)
{
List<WorkItem> results = [];
foreach (WorkItem workItem in workItems)
{
if (record.WorkItem.Priority is null)
{
results.Add(record.WorkItem);
break;
}
if (workItem.Priority == record.WorkItem.Priority.Value)
continue;
results.Add(workItem);
}
return new(results);
}
private static ReadOnlyCollection<WorkItem> GetWorkItemsNotMatching123066(Record record, ReadOnlyCollection<WorkItem> workItems)
{
List<WorkItem> results = [];
int check;
int state = GetState(record.WorkItem);
List<KeyValuePair<int, WorkItem>> collection = [];
foreach (WorkItem workItem in workItems)
{
if (workItem.State is "Removed")
continue;
check = GetState(workItem);
if (check == state)
continue;
collection.Add(new(check, workItem));
}
if (collection.Count > 0)
{
KeyValuePair<int, WorkItem>[] notNewState = (from l in collection where l.Value.State != "New" select l).ToArray();
if (notNewState.Length == 0 && record.WorkItem.State is "New" or "Active")
collection.Clear();
else if (notNewState.Length > 0)
{
int minimum = notNewState.Min(l => l.Key);
if (minimum == state)
collection.Clear();
else if (minimum == 1 && record.WorkItem.State == "New")
collection.Clear();
else if (notNewState.Length > 0 && record.WorkItem.State == "Active")
collection.Clear();
}
}
foreach (KeyValuePair<int, WorkItem> keyValuePair in collection.OrderByDescending(l => l.Key))
results.Add(keyValuePair.Value);
return new(results);
}
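// Check 123067: collects children whose GetState value differs from the parent's; note that the guard inside
// the loop tests the parent's State, so parents in "Removed", "Resolved" or "Closed" yield no results.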
private static ReadOnlyCollection<WorkItem> GetWorkItemsNotMatching123067(Record record, ReadOnlyCollection<WorkItem> workItems)
{
List<WorkItem> results = [];
int check;
int state = GetState(record.WorkItem);
List<KeyValuePair<int, WorkItem>> collection = [];
foreach (WorkItem workItem in workItems)
{
if (record.WorkItem.State is "Removed" or "Resolved" or "Closed")
continue;
check = GetState(workItem);
if (check == state)
continue;
collection.Add(new(check, workItem));
}
foreach (KeyValuePair<int, WorkItem> keyValuePair in collection.OrderByDescending(l => l.Key))
results.Add(keyValuePair.Value);
return new(results);
}
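// Returns the largest distinct IterationPath (default string comparison) among the given work items,
// or null when there are none.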
private static string? GetMaxIterationPath122508(ReadOnlyCollection<WorkItem> workItems)
{
string? result;
List<string> results = [];
foreach (WorkItem workItem in workItems)
{
if (results.Contains(workItem.IterationPath))
continue;
results.Add(workItem.IterationPath);
}
result = results.Count == 0 ? null : results.Max();
return result;
}
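// Check 122508: for each Feature, compares its IterationPath against the highest IterationPath found on its
// filtered children and appends a markdown section plus an annotated WorkItem for every mismatch.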
private static ReadOnlyCollection<WorkItem> FeatureCheckIterationPath122508(string url, List<string> lines, ReadOnlyCollection<string> workItemTypes, ReadOnlyCollection<Record> records, string workItemType)
{
List<WorkItem> results = [];
string? maxIterationPath;
List<string> collection = [];
List<string> violations = [];
ReadOnlyCollection<WorkItem> childrenWorkItems;
foreach (Record record in records)
{
if (record.WorkItem.State is "Removed")
continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
collection.Clear();
violations.Clear();
if (record.Children.Count == 0)
continue;
childrenWorkItems = FilterChildren(workItemTypes, record);
maxIterationPath = GetMaxIterationPath122508(childrenWorkItems);
if (string.IsNullOrEmpty(maxIterationPath) || record.WorkItem.IterationPath == maxIterationPath)
continue;
collection.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
collection.Add(string.Empty);
collection.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
collection.Add($"- [ ] {record.WorkItem.Id} => {record.WorkItem.IterationPath} != {maxIterationPath}");
collection.Add(string.Empty);
lines.AddRange(collection);
results.Add(WorkItem.Get(record.WorkItem, $"IterationPath:<a target='_blank' href=' {url}{record.WorkItem.Id} '>{record.WorkItem.Id}</a>;{record.WorkItem.IterationPath} != {maxIterationPath}"));
}
return new(results);
}
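// The remaining FeatureCheck* methods follow the same shape for Tags (122514), Priority (126169) and
// State (123066, 123067): emit a markdown section per offending Feature and return annotated work items.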
private static ReadOnlyCollection<WorkItem> FeatureCheckTag122514(string url, List<string> lines, ReadOnlyCollection<string> workItemTypes, ReadOnlyCollection<Record> records, string workItemType)
{
List<WorkItem> results = [];
List<string> collection = [];
List<string> violations = [];
ReadOnlyCollection<WorkItem> childrenWorkItems;
ReadOnlyCollection<WorkItem> workItemsNotMatching;
foreach (Record record in records)
{
if (record.WorkItem.State is "Removed")
continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
collection.Clear();
violations.Clear();
if (record.Children.Count == 0)
continue;
if (string.IsNullOrEmpty(record.WorkItem.Tags))
workItemsNotMatching = new([record.WorkItem]);
else
{
childrenWorkItems = FilterChildren(workItemTypes, record);
workItemsNotMatching = GetWorkItemsNotMatching122514(record, childrenWorkItems);
if (!string.IsNullOrEmpty(record.WorkItem.Tags) && workItemsNotMatching.Count == 0)
continue;
}
collection.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
collection.Add(string.Empty);
collection.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
foreach (WorkItem workItem in workItemsNotMatching)
collection.Add($"- [ ] {workItem} {nameof(record.WorkItem.Tags)} != {record.WorkItem.Tags}");
collection.Add(string.Empty);
lines.AddRange(collection);
violations.Add($"Tag:{record.WorkItem.Tags};");
foreach (WorkItem workItem in workItemsNotMatching)
violations.Add($"<a target='_blank' href=' {url}{workItem.Id} '>{workItem.Id}</a>:{workItem.Tags};");
results.Add(WorkItem.Get(record.WorkItem, string.Join(' ', violations)));
}
return new(results);
}
private static ReadOnlyCollection<WorkItem> FeatureCheckPriority126169(string url, List<string> lines, ReadOnlyCollection<string> workItemTypes, ReadOnlyCollection<Record> records, string workItemType)
{
List<WorkItem> results = [];
List<string> collection = [];
List<string> violations = [];
ReadOnlyCollection<WorkItem> childrenWorkItems;
ReadOnlyCollection<WorkItem> workItemsNotMatching;
foreach (Record record in records)
{
if (record.WorkItem.State is "Removed")
continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
collection.Clear();
violations.Clear();
if (record.Children.Count == 0)
continue;
childrenWorkItems = FilterChildren(workItemTypes, record);
workItemsNotMatching = GetWorkItemsNotMatching126169(record, childrenWorkItems);
if (workItemsNotMatching.Count == 0)
continue;
collection.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
collection.Add(string.Empty);
collection.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
foreach (WorkItem workItem in workItemsNotMatching)
collection.Add($"- [ ] [{workItem.Id}]({url}{workItem.Id}) {nameof(record.WorkItem.Priority)} != {record.WorkItem.Priority}");
collection.Add(string.Empty);
lines.AddRange(collection);
violations.Add($"Priority:{record.WorkItem.Priority};");
foreach (WorkItem workItem in workItemsNotMatching)
violations.Add($"<a target='_blank' href=' {url}{workItem.Id} '>{workItem.Id}</a>:{workItem.Priority};");
results.Add(WorkItem.Get(record.WorkItem, string.Join(' ', violations)));
}
return new(results);
}
private static ReadOnlyCollection<WorkItem> FeatureCheckState123066(string url, List<string> lines, ReadOnlyCollection<string> workItemTypes, ReadOnlyCollection<Record> records, string workItemType)
{
List<WorkItem> results = [];
List<string> collection = [];
List<string> violations = [];
ReadOnlyCollection<WorkItem> childrenWorkItems;
ReadOnlyCollection<WorkItem> workItemsNotMatching;
foreach (Record record in records)
{
if (record.WorkItem.State is "Removed")
continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
collection.Clear();
violations.Clear();
if (record.Children.Count == 0)
continue;
childrenWorkItems = FilterChildren(workItemTypes, record);
workItemsNotMatching = GetWorkItemsNotMatching123066(record, childrenWorkItems);
if (workItemsNotMatching.Count == 0)
continue;
collection.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
collection.Add(string.Empty);
collection.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
foreach (WorkItem workItem in workItemsNotMatching)
collection.Add($"- [ ] [{workItem.Id}]({url}{workItem.Id}) {nameof(record.WorkItem.State)} != {record.WorkItem.State}");
collection.Add(string.Empty);
lines.AddRange(collection);
violations.Add($"State:{record.WorkItem.State};");
foreach (WorkItem workItem in workItemsNotMatching)
violations.Add($"<a target='_blank' href=' {url}{workItem.Id} '>{workItem.Id}</a>:{workItem.State};");
results.Add(WorkItem.Get(record.WorkItem, string.Join(' ', violations)));
}
return new(results);
}
private static ReadOnlyCollection<WorkItem> FeatureCheckState123067(string url, List<string> lines, ReadOnlyCollection<string> workItemTypes, ReadOnlyCollection<Record> records, string workItemType)
{
List<WorkItem> results = [];
List<string> collection = [];
List<string> violations = [];
ReadOnlyCollection<WorkItem> childrenWorkItems;
ReadOnlyCollection<WorkItem> workItemsNotMatching;
foreach (Record record in records)
{
if (record.WorkItem.State is "Removed" or "New" or "Active")
continue;
if (record.WorkItem.WorkItemType != workItemType)
continue;
collection.Clear();
violations.Clear();
if (record.Children.Count == 0)
continue;
childrenWorkItems = FilterChildren(workItemTypes, record);
workItemsNotMatching = GetWorkItemsNotMatching123067(record, childrenWorkItems);
if (workItemsNotMatching.Count == 0)
continue;
collection.Add($"## {record.WorkItem.AssignedTo} - {record.WorkItem.Id} - {record.WorkItem.Title}");
collection.Add(string.Empty);
collection.Add($"- [{record.WorkItem.Id}]({url}{record.WorkItem.Id})");
foreach (WorkItem workItem in workItemsNotMatching)
collection.Add($"- [ ] [{workItem.Id}]({url}{workItem.Id}) {nameof(record.WorkItem.State)} != {record.WorkItem.State}");
collection.Add(string.Empty);
lines.AddRange(collection);
violations.Add($"State:{record.WorkItem.State};");
foreach (WorkItem workItem in workItemsNotMatching)
violations.Add($"<a target='_blank' href=' {url}{workItem.Id} '>{workItem.Id}</a>:{workItem.State};");
results.Add(WorkItem.Get(record.WorkItem, string.Join(' ', violations)));
}
return new(results);
}
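// Entry point: args[0] = source directory, args[2] and args[3] feed GetWorkItems, args[4] = '~'-separated
// work item types, args[5] = base URL for links, args[6] = destination directory. Writes the full record
// set, one markdown/HTML/JSON trio per work item type, and the five feature-level consistency checks below.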
internal static void WriteMarkdown(ILogger<Worker> logger, List<string> args)
{
string url = args[5];
List<char> spaces = [];
List<string> lines = [];
ReadOnlyCollection<WorkItem> results;
string[] workItemTypes = args[4].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
string destinationDirectory = Path.GetFullPath(args[6]);
if (!Directory.Exists(destinationDirectory))
_ = Directory.CreateDirectory(destinationDirectory);
ReadOnlyDictionary<int, Record> keyValuePairs = GetWorkItems(logger, args[2], args[3]);
WriteFileStructure(destinationDirectory, keyValuePairs);
ReadOnlyCollection<Record> records = new(keyValuePairs.Values.ToArray());
ReadOnlyCollection<string> bugUserStoryWorkItemTypes = new(new string[] { "Bug", "User Story" });
ReadOnlyCollection<string> bugUserStoryTaskWorkItemTypes = new(new string[] { "Bug", "User Story", "Task" });
WriteFiles(destinationDirectory, records, "with-parents");
foreach (string workItemType in workItemTypes)
{
lines.Clear();
lines.Add($"# {workItemType}");
lines.Add(string.Empty);
AppendLines(url, spaces, lines, records, workItemType);
results = new([]);
WriteFiles(destinationDirectory, new(lines), results, workItemType);
}
{
lines.Clear();
string workItemType = "Feature";
lines.Add($"# {nameof(FeatureCheckIterationPath122508)}");
lines.Add(string.Empty);
results = FeatureCheckIterationPath122508(url, lines, bugUserStoryTaskWorkItemTypes, records, workItemType);
WriteFiles(destinationDirectory, new(lines), results, "check-122508");
}
{
lines.Clear();
string workItemType = "Feature";
lines.Add($"# {nameof(FeatureCheckTag122514)}");
lines.Add(string.Empty);
results = FeatureCheckTag122514(url, lines, bugUserStoryWorkItemTypes, records, workItemType);
WriteFiles(destinationDirectory, new(lines), results, "check-122514");
}
{
lines.Clear();
string workItemType = "Feature";
lines.Add($"# {nameof(FeatureCheckPriority126169)}");
lines.Add(string.Empty);
results = FeatureCheckPriority126169(url, lines, bugUserStoryWorkItemTypes, records, workItemType);
WriteFiles(destinationDirectory, new(lines), results, "check-126169");
}
{
lines.Clear();
string workItemType = "Feature";
lines.Add($"# {nameof(FeatureCheckState123066)}");
lines.Add(string.Empty);
results = FeatureCheckState123066(url, lines, bugUserStoryTaskWorkItemTypes, records, workItemType);
WriteFiles(destinationDirectory, new(lines), results, "check-123066");
}
{
lines.Clear();
string workItemType = "Feature";
lines.Add($"# {nameof(FeatureCheckState123067)}");
lines.Add(string.Empty);
results = FeatureCheckState123067(url, lines, bugUserStoryTaskWorkItemTypes, records, workItemType);
WriteFiles(destinationDirectory, new(lines), results, "check-123067");
}
}
#else
internal static void WriteMarkdown(ILogger<Worker> logger, List<string> args)
{
logger.LogError("WriteMarkdown is not available in CommonMark {args[0]}", args[0]);
logger.LogError("WriteMarkdown is not available in CommonMark {args[1]}", args[1]);
}
#endif
}

@@ -0,0 +1,47 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240916
{
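// Scans files matching args[2] under args[0]; for each file it captures the value that follows the args[4]
// marker (logged as the domain) and the last value that follows any of the '~'-separated markers in args[5]
// (logged as the debug target), then logs an nginx-style "include" line built from args[3] and the file name.
// Judging by the method name this is a proxy_pass debugging aid; that reading is an assumption.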
internal static void DebugProxyPass(ILogger<Worker> logger, List<string> args)
{
string debug;
string domain;
string[] lines;
string fileName;
string[] segments;
string includePath = args[3];
string searchString = args[4];
string searchPattern = args[2];
string[] searchStrings = args[5].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
logger.LogInformation("With search pattern '{SearchPattern}' found {files} file(s)", searchPattern, files.Length);
foreach (string file in files)
{
debug = string.Empty;
domain = string.Empty;
lines = File.ReadAllLines(file);
fileName = Path.GetFileName(file);
foreach (string line in lines)
{
segments = line.Split(searchString, StringSplitOptions.None);
if (segments.Length > 1 && segments[1][0] is ' ' or '\t')
domain = segments[1].Trim().Trim(';');
segments = line.Split(searchStrings, StringSplitOptions.None);
if (segments.Length < 2)
continue;
if (segments[1][0] is not ' ' and not '\t')
continue;
debug = segments[1].Trim().Trim(';');
}
logger.LogInformation("include {includePath}{fileName}; # https://{domain} # {debug}",
includePath,
fileName,
domain,
debug);
}
}
}

@@ -0,0 +1,65 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20240925
{
private record Test(string Name,
long Value);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ReadOnlyCollection<Test>))]
private partial class TestCollectionSourceGenerationContext : JsonSerializerContext
{
}
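// Collects distinct (Name, Value) pairs: Value is the first comma-separated token after searchPatternB
// (parsed as a long) and Name is the second token with quotes trimmed; results are ordered by name length,
// then name.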
private static ReadOnlyCollection<Test> GetTests(string sourceDirectory, string searchPattern, string searchPatternB)
{
List<Test> results = [];
long test;
string[] lines;
string[] segments;
string[] segmentsB;
List<long> distinct = [];
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
foreach (string file in files)
{
lines = File.ReadAllLines(file);
foreach (string line in lines)
{
segments = line.Split(searchPatternB);
if (segments.Length < 2)
continue;
segmentsB = segments[1].Split(',');
if (segmentsB.Length < 2)
continue;
if (!long.TryParse(segmentsB[0], out test))
continue;
if (distinct.Contains(test))
continue;
distinct.Add(test);
results.Add(new(segmentsB[1].Trim('"'), test));
}
}
return (from l in results orderby l.Name.Length, l.Name select l).ToArray().AsReadOnly();
}
internal static void DistinctTests(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string searchPatternB = args[3];
string destinationDirectory = args[4];
string sourceDirectory = Path.GetFullPath(args[0]);
if (!Directory.Exists(destinationDirectory))
_ = Directory.CreateDirectory(destinationDirectory);
ReadOnlyCollection<Test> tests = GetTests(sourceDirectory, searchPattern, searchPatternB);
logger.LogInformation("Found {files} file(s)", tests.Count);
string json = JsonSerializer.Serialize(tests, TestCollectionSourceGenerationContext.Default.ReadOnlyCollectionTest);
string fileName = Path.Combine(destinationDirectory, ".json");
File.WriteAllText(fileName, json);
}
}

@@ -0,0 +1,177 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20241002
{
private record Record(string? Calculation,
string Chart,
string Group,
string GroupId,
long Id,
string? RawCalculation,
string Test,
string TestId)
{
internal static Record Get(Record record, string? calculation) =>
new(calculation,
record.Chart,
record.Group,
record.GroupId,
record.Id,
record.RawCalculation,
record.Test,
record.TestId);
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ReadOnlyCollection<Record>))]
private partial class RecordCollectionSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(ReadOnlyDictionary<string, Record>))]
private partial class RecordDictionarySourceGenerationContext : JsonSerializerContext
{
}
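// Scans the lines that follow a test definition for "<id>=<calculation>" entries introduced by
// searchPatternC and returns the last calculation that belongs to the same id, or null when none is found.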
private static string? GetCalculation(string searchPatternC, string[] lines, int i, string id, long idValue)
{
string? result = null;
string line;
long check;
string[] segments;
string[] segmentsB;
for (int j = i + 1; j < lines.Length; j++)
{
line = lines[j];
if (!line.Contains(id))
break;
segments = line.Split(searchPatternC);
if (segments.Length < 2)
continue;
segmentsB = segments[1].Split('=');
if (segmentsB.Length < 2)
continue;
if (!long.TryParse(segmentsB[0], out check))
continue;
if (check != idValue)
break;
result = segmentsB[1];
}
return result;
}
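// Rewrites each record's raw calculation by replacing every known "ch(n)" key with a
// "%DCS(Value, <Test>)" reference to the corresponding test.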
private static ReadOnlyDictionary<string, Record> GetKeyValuePairs(ReadOnlyDictionary<string, Record> keyValuePairs)
{
Dictionary<string, Record> results = [];
Record result;
Record record;
string? calculation;
foreach (KeyValuePair<string, Record> keyValuePair in keyValuePairs)
{
record = keyValuePair.Value;
calculation = record.RawCalculation;
if (calculation is not null)
{
foreach (KeyValuePair<string, Record> kVP in keyValuePairs)
calculation = calculation.Replace(kVP.Key, $"%DCS(Value, {kVP.Value.Test})");
}
result = Record.Get(record, calculation);
results.Add(keyValuePair.Key, result);
}
return new(results);
}
private static string GetKey(Record record) =>
$"ch({record.Id + 1})";
private static ReadOnlyCollection<Record> GetRecords(string sourceDirectory, string searchPattern, string searchPatternB, string searchPatternC)
{
List<Record> results = [];
string id;
string line;
long idValue;
string[] lines;
string[] segments;
string[] segmentsB;
string[] segmentsC;
string? calculation;
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
foreach (string file in files)
{
lines = File.ReadAllLines(file);
for (int i = 0; i < lines.Length; i++)
{
line = lines[i];
segments = line.Split(searchPatternB);
if (segments.Length < 2)
continue;
segmentsB = segments[1].Split('=');
if (segmentsB.Length < 2)
continue;
if (!long.TryParse(segmentsB[0], out idValue))
continue;
id = segmentsB[0];
segmentsC = segments[1].Split(',');
if (segmentsC.Length < 5) // segmentsC[3] and segmentsC[4] are read below
continue;
calculation = GetCalculation(searchPatternC, lines, i, id, idValue);
results.Add(new(null,
Path.GetFileName(file),
segmentsC[2].Trim('"'),
segmentsC[1],
idValue,
calculation,
segmentsC[4].Trim('"'),
segmentsC[3]));
}
}
return new(results);
}
private static ReadOnlyDictionary<string, Record> GetKeyValuePairs(ReadOnlyCollection<Record> records)
{
Dictionary<string, Record> results = [];
string key;
string? last = null;
foreach (Record record in records)
{
if (last is not null && record.Chart != last)
continue;
last = record.Chart;
key = GetKey(record);
if (results.ContainsKey(key))
continue;
results.Add(key, record);
}
return new(results);
}
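// Parses the InfinityQS project files matched by args[2], keeps the tests of the first chart encountered
// keyed by their "ch(n)" handle, rewrites the calculations to %DCS references, and writes the dictionary
// to a single ".json" file in the destination directory.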
internal static void ConvertInfinityQSProjectFiles(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string searchPatternB = args[3];
string searchPatternC = args[4];
string destinationDirectory = args[5];
string sourceDirectory = Path.GetFullPath(args[0]);
if (!Directory.Exists(destinationDirectory))
_ = Directory.CreateDirectory(destinationDirectory);
ReadOnlyCollection<Record> records = GetRecords(sourceDirectory, searchPattern, searchPatternB, searchPatternC);
logger.LogInformation("Found {records} records(s)", records.Count);
ReadOnlyDictionary<string, Record> collection = GetKeyValuePairs(records);
logger.LogInformation("Found {collection} collection(s)", collection.Count);
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(collection);
logger.LogInformation("Found {keyValuePairs}", keyValuePairs.Count);
string json = JsonSerializer.Serialize(keyValuePairs, RecordDictionarySourceGenerationContext.Default.ReadOnlyDictionaryStringRecord);
string fileName = Path.Combine(destinationDirectory, ".json");
File.WriteAllText(fileName, json);
}
}

@@ -0,0 +1,47 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20241029
{
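// Builds a descending list of Fibonacci-like story-point sizes scaled by 1000, inserting `factor`
// interpolated values after each base size so at least `length` entries exist.
// Worked example for length = 14: factor = 2 and sort = 0.499, which yields
// 1000, 1499, 1998, 2000, 2499, 2998, ..., 20000, 20499, 20998 before the final Reverse().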
private static ReadOnlyCollection<string> GetFibonacci(int length)
{
List<string> results = [];
int[] fibonacci =
[
1, // x-small
2, // small
3, // medium
5, // large
8, // x-large
13, // xx-large
20 // xxx-large
];
double factor = Math.Floor((double)(length / fibonacci.Length));
double sort = Math.Round(1 / factor, 6) - 0.001;
for (int j = 0; j < fibonacci.Length; j++)
{
results.Add((fibonacci[j] * 1000).ToString());
for (int i = 0; i < factor; i++)
results.Add(((int)Math.Round((fibonacci[j] + ((i + 1) * sort)) * 1000)).ToString());
}
if (results.Count < length)
throw new InvalidOperationException($"Generated {results.Count} values but at least {length} were requested.");
results.Reverse();
return new(results);
}
internal static void GetFibonacci(ILogger<Worker> logger, List<string> args)
{
int length = int.Parse(args[2]);
ReadOnlyCollection<string> collection;
for (int i = 1; i < 200; i++)
_ = GetFibonacci(i);
collection = GetFibonacci(length);
foreach (string fibonacci in collection)
logger.LogInformation(fibonacci);
File.WriteAllText(".vscode/helper/.txt", string.Join(Environment.NewLine, collection));
}
}

@@ -0,0 +1,433 @@
using Microsoft.Extensions.Logging;
#if HgCV
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
#endif
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20241030
{
#if HgCV
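// Parses a plain-text measurement report into a Header, a Summary (mean, standard deviation % and radial
// gradient columns) and per-site Points, then GetComplete writes the result as "<file>.json" next to the
// source file. The HgCV build symbol suggests mercury-probe CV data, but that reading is an assumption.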
private record Complete(Header Header, Summary Summary, ReadOnlyCollection<Point> Points)
{
internal static Complete? Get(int take, string site, string multiple, string summaryLine, string lastUnits, string lastUnitsB, ReadOnlyCollection<string> lines)
{
Complete? result;
Header? header = Header.Get(lines, site, summaryLine);
if (header is null)
result = null;
else
{
Summary? summary = SummarySegment.Get(lines, site, summaryLine, lastUnits);
if (summary is null)
result = null;
else
{
ReadOnlyCollection<Point> points = Point.GetCollection(lines, take, site, multiple, summaryLine, lastUnitsB) ?? throw new NullReferenceException(nameof(summary));
if (points.Count == 0)
result = null;
else
result = new(header, summary, points);
}
}
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Complete))]
private partial class CompleteSourceGenerationContext : JsonSerializerContext
{
}
private record Header([property: JsonPropertyName("Operator")] string Operator,
[property: JsonPropertyName("Start Voltage")] string StartVoltage,
[property: JsonPropertyName("Wafer")] string Wafer,
[property: JsonPropertyName("Stop Voltage")] string StopVoltage,
[property: JsonPropertyName("Lot")] string Lot,
[property: JsonPropertyName("Ramp Rate")] string RampRate,
[property: JsonPropertyName("Plan")] string Plan,
[property: JsonPropertyName("G limit")] string GLimit,
[property: JsonPropertyName("Date")] string Date,
[property: JsonPropertyName("Time")] string Time,
[property: JsonPropertyName("Setup File")] string SetupFile,
[property: JsonPropertyName("Wafer size")] string WaferSize,
[property: JsonPropertyName("Folder")] string Folder,
[property: JsonPropertyName("Ccomp")] string Ccomp,
[property: JsonPropertyName("Pattern")] string Pattern,
[property: JsonPropertyName("Area")] string Area,
[property: JsonPropertyName("Cond Type")] string CondType,
[property: JsonPropertyName("Rho Method")] string RhoMethod,
[property: JsonPropertyName("Model")] string Model)
{
private static string[] GetRemove() =>
[
" L L",
" O O",
" G G",
" C C",
" O O",
" N N",
" C C",
" E E",
" N N",
" T T",
" R R",
" A A",
" T T",
" I I",
" O O",
" N N"
];
internal static Header Get() =>
new(string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty);
private static ReadOnlyCollection<JsonProperty> GetJsonProperties()
{
JsonProperty[] results;
string json;
Header header = Get();
json = JsonSerializer.Serialize(header);
JsonElement jsonElement = JsonSerializer.Deserialize<JsonElement>(json);
results = jsonElement.EnumerateObject().ToArray();
return new(results);
}
internal static Header? Get(ReadOnlyCollection<string> lines, string site, string summaryLine)
{
Header? result;
string json;
string check;
string[] segments;
string[] segmentsB;
string[] segmentsC;
bool found = false;
string[] remove = GetRemove();
Dictionary<string, string> keyValuePairs = [];
ReadOnlyCollection<JsonProperty> jsonProperties = GetJsonProperties();
foreach (string line in lines)
{
if (line.Contains(site))
found = true;
if (!found)
continue;
if (line == summaryLine)
break;
foreach (JsonProperty jsonProperty in jsonProperties)
{
segments = line.Split([$"{jsonProperty.Name}:", $"{jsonProperty.Name} :"], StringSplitOptions.None);
if (segments.Length < 2)
continue;
check = segments[1].Trim();
foreach (JsonProperty jsonPropertyB in jsonProperties)
{
segmentsB = check.Split([$"{jsonPropertyB.Name}:", $"{jsonPropertyB.Name} :"], StringSplitOptions.None);
if (segmentsB.Length > 1)
check = segmentsB[0].Trim();
}
foreach (string r in remove)
{
segmentsC = check.Split(r);
if (segmentsC.Length > 1)
check = segmentsC[0].Trim();
}
keyValuePairs.Add(jsonProperty.Name, check);
}
}
if (keyValuePairs.Count != jsonProperties.Count)
result = null;
else
{
json = JsonSerializer.Serialize(keyValuePairs);
result = JsonSerializer.Deserialize(json, HeaderSourceGenerationContext.Default.Header) ?? throw new NullReferenceException(nameof(result));
}
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Header))]
private partial class HeaderSourceGenerationContext : JsonSerializerContext
{
}
private record Summary(SummarySegment? Mean, SummarySegment? StandardDeviationPercentage, SummarySegment? RadialGradient);
private record SummarySegment([property: JsonPropertyName("Navg")] string NAvg,
[property: JsonPropertyName("Nsl")] string Nsl,
[property: JsonPropertyName("Vd")] string Vd,
[property: JsonPropertyName("Flat Z")] string FlatZ,
[property: JsonPropertyName("Rhoavg")] string RhoAvg,
[property: JsonPropertyName("Rhosl")] string Rhosl,
[property: JsonPropertyName("Phase")] string Phase,
[property: JsonPropertyName("Grade")] string Grade,
[property: JsonPropertyName("@ Rs")] string Rs)
{
internal static SummarySegment Get() =>
new(string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty);
private static ReadOnlyCollection<JsonProperty> GetJsonProperties()
{
JsonProperty[] results;
string json;
SummarySegment summarySegment = Get();
json = JsonSerializer.Serialize(summarySegment);
JsonElement jsonElement = JsonSerializer.Deserialize<JsonElement>(json);
results = jsonElement.EnumerateObject().ToArray();
return new(results);
}
internal static Summary? Get(ReadOnlyCollection<string> lines, string site, string summaryLine, string lastUnits)
{
Summary? result;
string json;
string[] segments;
bool found = false;
string[] segmentsB;
Dictionary<string, string> keyValuePairs = [];
Dictionary<string, string> keyValuePairsB = [];
Dictionary<string, string> keyValuePairsC = [];
ReadOnlyCollection<JsonProperty> jsonProperties = GetJsonProperties();
foreach (string line in lines)
{
if (line == summaryLine)
found = true;
if (!found)
continue;
if (line.Contains(site))
break;
if (line.Contains(lastUnits))
break;
foreach (JsonProperty jsonProperty in jsonProperties)
{
segments = line.Split([$"{jsonProperty.Name}:", $"{jsonProperty.Name} :"], StringSplitOptions.None);
if (segments.Length < 2 || (!line.StartsWith(jsonProperty.Name) && !line.StartsWith($"@ {jsonProperty.Name}")))
continue;
segmentsB = segments[1].Trim().Split(' ');
if (segmentsB.Length < 3)
continue;
keyValuePairs.Add(jsonProperty.Name, segmentsB[0]);
keyValuePairsB.Add(jsonProperty.Name, segmentsB[1]);
keyValuePairsC.Add(jsonProperty.Name, segmentsB[2]);
}
}
if (keyValuePairs.Count != jsonProperties.Count || keyValuePairsB.Count != jsonProperties.Count || keyValuePairsC.Count != jsonProperties.Count)
result = null;
else
{
json = JsonSerializer.Serialize(keyValuePairs);
SummarySegment? mean = JsonSerializer.Deserialize(json, SummarySegmentSourceGenerationContext.Default.SummarySegment);
json = JsonSerializer.Serialize(keyValuePairsB);
SummarySegment? standardDeviationPercentage = JsonSerializer.Deserialize(json, SummarySegmentSourceGenerationContext.Default.SummarySegment);
json = JsonSerializer.Serialize(keyValuePairsC);
SummarySegment? radialGradient = JsonSerializer.Deserialize(json, SummarySegmentSourceGenerationContext.Default.SummarySegment);
result = new(mean, standardDeviationPercentage, radialGradient);
}
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(SummarySegment))]
private partial class SummarySegmentSourceGenerationContext : JsonSerializerContext
{
}
private record Point([property: JsonPropertyName("Site")] string Site,
[property: JsonPropertyName("X")] string X,
[property: JsonPropertyName("Y")] string Y,
[property: JsonPropertyName("Navg")] string NAvg,
[property: JsonPropertyName("Rhoavg")] string RhoAvg,
[property: JsonPropertyName("Nsl")] string Nsl,
[property: JsonPropertyName("Rhosl")] string Rhosl,
[property: JsonPropertyName("Vd")] string Vd,
[property: JsonPropertyName("Phase")] string Phase,
[property: JsonPropertyName("Flat Z")] string FlatZ,
[property: JsonPropertyName("Grade")] string Grade,
[property: JsonPropertyName("X Left")] string XLeft,
[property: JsonPropertyName("X Right")] string XRight,
[property: JsonPropertyName("Bottom Y")] string BottomY,
[property: JsonPropertyName("Top Y")] string TopY)
{
internal static Point Get() =>
new(string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty,
string.Empty);
internal static ReadOnlyCollection<Point> GetCollection(ReadOnlyCollection<string> lines, int take, string site, string multiple, string summaryLine, string lastUnitsB)
{
List<Point> results = [];
string s;
string line;
Point point;
string[] segments;
string[] segmentsB;
bool found = false;
bool foundB = false;
string[] segmentsC;
List<string> sites = [];
for (int i = 0; i < lines.Count; i++)
{
line = lines[i];
segmentsC = line.Split(site, StringSplitOptions.RemoveEmptyEntries);
if (segmentsC.Length > 1)
{
foreach (string segment in segmentsC)
sites.Add(segment.Trim());
}
if (line == summaryLine)
{
sites.RemoveAt(0);
found = true;
}
if (!found)
continue;
if (!foundB && line.Contains(multiple))
foundB = true;
if (line != lastUnitsB)
continue;
if (foundB)
{
foundB = false;
continue;
}
for (int j = 0; j < sites.Count; j++)
{
s = sites[j];
if (i + take >= lines.Count) // lines[i + take] is the last index read below
break;
segments = s.Split(["(", ",", ")"], StringSplitOptions.None);
if (segments.Length < 3) // segments[0], [1] and [2] are read below
break;
segmentsB = lines[i + 10].Split(' ');
if (segmentsB.Length < 2)
break;
point = new(segments[0].Trim(),
segments[1].Trim(),
segments[2].Trim(),
NAvg: lines[i + 2].Trim(),
Nsl: lines[i + 3].Trim(),
Vd: lines[i + 4].Trim(),
FlatZ: lines[i + 5].Trim(),
RhoAvg: lines[i + 6].Trim(),
Rhosl: lines[i + 7].Trim(),
Phase: lines[i + 8].Trim(),
Grade: lines[i + 9].Trim(),
XLeft: segmentsB[0],
XRight: segmentsB[1],
BottomY: lines[i + 11].Trim(),
TopY: lines[i + 12].Trim());
results.Add(point);
i += take;
}
sites.Clear();
}
return new(results);
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Point))]
private partial class PointSourceGenerationContext : JsonSerializerContext
{
}
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
else
{
const int take = 12;
string[] lines = File.ReadAllLines(files[0]);
ReadOnlyCollection<string> collection = new(lines);
if (collection.Count < take)
logger.LogError("File {files[0]} has less than {take} lines", files[0], take);
else
{
const string site = "Site: ";
const string multiple = "MULTIPLE";
const string summaryLine = "SUMMARY A A";
const string lastUnits = "Flat Z: Grade : % Flat Z: Grade : %";
const string lastUnitsB = "Flat Z: Grade : % Flat Z: Grade : % Flat Z: Grade : %";
Complete? complete = Complete.Get(take, site, multiple, summaryLine, lastUnits, lastUnitsB, collection);
if (complete is null)
logger.LogError("Could not get Complete from {files[0]}", files[0]);
else
{
string json = JsonSerializer.Serialize(complete, CompleteSourceGenerationContext.Default.Complete);
File.WriteAllText($"{files[0]}.json", json);
}
}
}
}
#else
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
logger.LogError("GetComplete is not available in HgCV {args[1]}", args[1]);
}
#endif
}

@@ -0,0 +1,559 @@
using Microsoft.Extensions.Logging;
#if CDE
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
#endif
namespace File_Folder_Helper.ADO2024.PI3;
internal static partial class Helper20241031
{
#if CDE
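// Parses a fixed block of thirteen header lines (Line1..Line13, with line 4 also providing the
// <Avg,Dev,Min,Max> values) followed by a whitespace/tab separated point table, and GetComplete writes the
// parsed structure as "<file>.json" next to the source file. The CDE build symbol hints at a CDE resistivity
// mapper export; that interpretation is an assumption.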
private record Complete(Line1 Line1, Line2 Line2, Line3 Line3, Line4 Line4, Line4B Line4B, Line5 Line5, Line6 Line6, Line7 Line7, Line8 Line8, Line9 Line9, Line10 Line10, Line11 Line11, Line12 Line12, Line13 Line13, Point[] Points)
{
internal static ReadOnlyCollection<string> GetCollection(string[] segments)
{
List<string> results = [];
foreach (string segment in segments)
{
if (segment[0] == ',')
break;
results.Add(segment);
}
return new(results);
}
internal static Complete? Get(int take, ReadOnlyCollection<string> lines)
{
Complete? result;
if (lines.Count < take)
result = null;
else
{
string[] separator = [" ", "\t"];
// <Title>
Line1 line1 = Line1.Get(lines[0].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <FileName, Proj,Rcpe, LotID,WfrID, Is_TF_DataFile>
Line2 line2 = Line2.Get(lines[1].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <Directory>
Line3 line3 = Line3.Get(lines[2].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <DateTime,Temp,TCR%,N|P>
Line4 line4 = Line4.Get(lines[3].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <Avg,Dev,Min,Max>
Line4B? line4B = Line4B.Get(lines[3].Split([">"], StringSplitOptions.RemoveEmptyEntries));
if (line4B is null)
result = null;
else
{
// <Operator, Epuipment>
Line5 line5 = Line5.Get(lines[4].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <Engineer>
Line6 line6 = Line6.Get(lines[5].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <AreaOrDiamScan, WaferShape, dNBand, TemplateFile, xsize,ysize, CalibFactor, MsmtMode, DataType, DataUnit>
Line7 line7 = Line7.Get(lines[6].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <NumProbePoints, SingleOrDualProbeConfig, #ActPrbPts, Rsens,IdrvMx,VinGain, DataRejectSigma, MeritThreshold, PrbChg#, PrbName>
Line8 line8 = Line8.Get(lines[7].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <WaferSize,EdgeEx, x,yll, x,yur, #x,y, CutCorners>
Line9 line9 = Line9.Get(lines[8].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <Diam: ThScan Start End Step>
Line10 line10 = Line10.Get(lines[9].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <FlatOrNotch FollowMajorFlat AutoOrManualLoad RangeOrIndvdual PauseAfterEveryRun, AutoPrint,Plot, BulkSmplThk & Unit>
Line11 line11 = Line11.Get(lines[10].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <RangeFrom, RangeTo>
Line12 line12 = Line12.Get(lines[11].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <CassSlotSelected>
Line13 line13 = Line13.Get(lines[12].Split(separator, StringSplitOptions.RemoveEmptyEntries));
// <R,Th,Data, Rs,RsA,RsB, #Smpl, x,y, Irng,Vrng, ChiSq,merit/GOF, DataIntegrity>
ReadOnlyCollection<Point> points = Point.Get(take, lines, separator);
if (points.Count == 0)
result = null;
else
result = new(line1, line2, line3, line4, line4B, line5, line6, line7, line8, line9, line10, line11, line12, line13, points.ToArray());
}
}
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Complete))]
private partial class CompleteSourceGenerationContext : JsonSerializerContext
{
}
private record Line1([property: JsonPropertyName("Title")] string Title)
{
internal static Line1 Get(string[] segments)
{
Line1 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line1))]
private partial class Line1SourceGenerationContext : JsonSerializerContext
{
}
private record Line2([property: JsonPropertyName("FileName")] string FileName,
[property: JsonPropertyName("Proj")] string Project,
[property: JsonPropertyName("Rcpe")] string RecipeName,
[property: JsonPropertyName("LotID")] string LotID,
[property: JsonPropertyName("WfrID")] string WfrID,
[property: JsonPropertyName("Is_TF_DataFile")] string Is_TF_DataFile)
{
internal static Line2 Get(string[] segments)
{
Line2 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line2))]
private partial class Line2SourceGenerationContext : JsonSerializerContext
{
}
private record Line3([property: JsonPropertyName("Directory")] string Directory)
{
internal static Line3 Get(string[] segments)
{
Line3 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line3))]
private partial class Line3SourceGenerationContext : JsonSerializerContext
{
}
private record Line4([property: JsonPropertyName("Time")] string Time,
[property: JsonPropertyName("Date")] string Date,
[property: JsonPropertyName("Temp")] string Temp,
[property: JsonPropertyName("TCR%")] string TCRPercent,
[property: JsonPropertyName("N|P")] string NOrP)
{
internal static Line4 Get(string[] segments)
{
Line4 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line4))]
private partial class Line4SourceGenerationContext : JsonSerializerContext
{
}
private record Line4B([property: JsonPropertyName("Avg")] string Avg,
[property: JsonPropertyName("Dev")] string Dev,
[property: JsonPropertyName("Min")] string Min,
[property: JsonPropertyName("Max")] string Max)
{
internal static Line4B? Get(string[] segments)
{
Line4B? result;
if (segments.Length < 2)
result = null;
else
{
string[] segmentsB = segments[1].Split([" "], StringSplitOptions.RemoveEmptyEntries);
result = new(segmentsB.Length < 2 ? string.Empty : segmentsB[1],
segmentsB.Length < 4 ? string.Empty : segmentsB[3],
segmentsB.Length < 6 ? string.Empty : segmentsB[5],
segmentsB.Length < 8 ? string.Empty : segmentsB[7]);
}
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line4B))]
private partial class Line4BSourceGenerationContext : JsonSerializerContext
{
}
private record Line5([property: JsonPropertyName("Operator")] string Operator,
[property: JsonPropertyName("Epuipment")] string Equipment)
{
internal static Line5 Get(string[] segments)
{
Line5 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line5))]
private partial class Line5SourceGenerationContext : JsonSerializerContext
{
}
private record Line6([property: JsonPropertyName("Engineer")] string Engineer)
{
internal static Line6 Get(string[] segments)
{
Line6 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line6))]
private partial class Line6SourceGenerationContext : JsonSerializerContext
{
}
private record Line7([property: JsonPropertyName("AreaOrDiamScan")] string AreaOrDiamScan,
[property: JsonPropertyName("WaferShape")] string WaferShape,
[property: JsonPropertyName("dNBand")] string BNBand,
[property: JsonPropertyName("TemplateFile")] string TemplateFile,
[property: JsonPropertyName("xsize")] string XSize,
[property: JsonPropertyName("ysize")] string YSize,
[property: JsonPropertyName("CalibFactor")] string CalibrationFactor,
[property: JsonPropertyName("MsmtMode")] string MsmtMode,
[property: JsonPropertyName("DataType")] string DataType,
[property: JsonPropertyName("DataUnit")] string DataUnit)
{
internal static Line7 Get(string[] segments)
{
Line7 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5],
collection.Count < 7 ? string.Empty : collection[6],
collection.Count < 8 ? string.Empty : collection[7],
collection.Count < 9 ? string.Empty : collection[8],
collection.Count < 10 ? string.Empty : collection[9]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line7))]
private partial class Line7SourceGenerationContext : JsonSerializerContext
{
}
private record Line8([property: JsonPropertyName("NumProbePoints")] string NumProbePoints,
[property: JsonPropertyName("SingleOrDualProbeConfig")] string SingleOrDualProbeConfig,
[property: JsonPropertyName("#ActPrbPts")] string NumberActPrbPts,
[property: JsonPropertyName("Rsens")] string Rsens,
[property: JsonPropertyName("IdrvMx")] string IdrvMx,
[property: JsonPropertyName("VinGain")] string VinGain,
[property: JsonPropertyName("DataRejectSigma")] string DataRejectSigma,
[property: JsonPropertyName("MeritThreshold")] string MeritThreshold,
[property: JsonPropertyName("PrbChg#")] string PrbChgNumber,
[property: JsonPropertyName("PrbName")] string PrbName)
{
internal static Line8 Get(string[] segments)
{
Line8 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5],
collection.Count < 7 ? string.Empty : collection[6],
collection.Count < 8 ? string.Empty : collection[7],
collection.Count < 9 ? string.Empty : collection[8],
collection.Count < 10 ? string.Empty : collection[9]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line8))]
private partial class Line8SourceGenerationContext : JsonSerializerContext
{
}
private record Line9([property: JsonPropertyName("WaferSize")] string WaferSize,
[property: JsonPropertyName("EdgeEx")] string EdgeEx,
[property: JsonPropertyName("xll")] string Xll,
[property: JsonPropertyName("yll")] string Yll,
[property: JsonPropertyName("xur")] string Xur,
[property: JsonPropertyName("yur")] string Yur,
[property: JsonPropertyName("x")] string X,
[property: JsonPropertyName("y")] string Y,
[property: JsonPropertyName("CutCorners")] string CutCorners)
{
internal static Line9 Get(string[] segments)
{
Line9 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5],
collection.Count < 7 ? string.Empty : collection[6],
collection.Count < 8 ? string.Empty : collection[7],
collection.Count < 9 ? string.Empty : collection[8]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line9))]
private partial class Line9SourceGenerationContext : JsonSerializerContext
{
}
private record Line10([property: JsonPropertyName("Diam ThScan")] string DiamThScan,
[property: JsonPropertyName("Diam Start")] string DiamStart,
[property: JsonPropertyName("Diam End")] string DiamEnd,
[property: JsonPropertyName("Diam Step")] string DiamStep)
{
internal static Line10 Get(string[] segments)
{
Line10 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line10))]
private partial class Line10SourceGenerationContext : JsonSerializerContext
{
}
private record Line11([property: JsonPropertyName("FlatOrNotch")] string FlatOrNotch,
[property: JsonPropertyName("FollowMajorFlat")] string FollowMajorFlat,
[property: JsonPropertyName("AutoOrManualLoad")] string AutoOrManualLoad,
[property: JsonPropertyName("RangeOrIndvdual")] string RangeOrIndividual,
[property: JsonPropertyName("PauseAfterEveryRun")] string PauseAfterEveryRun,
[property: JsonPropertyName("AutoPrint")] string AutoPrint,
[property: JsonPropertyName("Plot")] string Plot,
[property: JsonPropertyName("BulkSmplThk")] string BulkSampleThk,
[property: JsonPropertyName("Unit")] string Unit)
{
internal static Line11 Get(string[] segments)
{
Line11 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5],
collection.Count < 7 ? string.Empty : collection[6],
collection.Count < 8 ? string.Empty : collection[7],
collection.Count < 9 ? string.Empty : collection[8]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line11))]
private partial class Line11SourceGenerationContext : JsonSerializerContext
{
}
private record Line12([property: JsonPropertyName("RangeFrom")] string RangeFrom,
[property: JsonPropertyName("RangeTo")] string RangeTo)
{
internal static Line12 Get(string[] segments)
{
Line12 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line12))]
private partial class Line12SourceGenerationContext : JsonSerializerContext
{
}
private record Line13([property: JsonPropertyName("CassSlotSelected")] string CassetteSlotSelected)
{
internal static Line13 Get(string[] segments)
{
Line13 result;
ReadOnlyCollection<string> collection = Complete.GetCollection(segments);
result = new(collection.Count < 1 ? string.Empty : collection[0]);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Line13))]
private partial class Line13SourceGenerationContext : JsonSerializerContext
{
}
private record Point([property: JsonPropertyName("R")] string R,
[property: JsonPropertyName("Th")] string Th,
[property: JsonPropertyName("Data")] string Data,
[property: JsonPropertyName("Rs")] string Rs,
[property: JsonPropertyName("RsA")] string RsA,
[property: JsonPropertyName("RsB")] string RsB,
[property: JsonPropertyName("#Smpl")] string NumberSample,
[property: JsonPropertyName("x")] string X,
[property: JsonPropertyName("y")] string Y,
[property: JsonPropertyName("Irng")] string Irng,
[property: JsonPropertyName("Vrng")] string Vrng,
[property: JsonPropertyName("ChiSq")] string ChiSq,
[property: JsonPropertyName("merit/GOF")] string MeritGOF,
[property: JsonPropertyName("DataIntegrity")] string DataIntegrity)
{
internal static ReadOnlyCollection<Point> Get(int take, ReadOnlyCollection<string> lines, string[] separator)
{
List<Point> results = [];
Point point;
string[] segments;
ReadOnlyCollection<string> collection;
for (int i = take - 1; i < lines.Count; i++)
{
if (string.IsNullOrEmpty(lines[i]))
break;
segments = lines[i].Split(separator, StringSplitOptions.RemoveEmptyEntries);
collection = Complete.GetCollection(segments);
point = new(collection.Count < 1 ? string.Empty : collection[0],
collection.Count < 2 ? string.Empty : collection[1],
collection.Count < 3 ? string.Empty : collection[2],
collection.Count < 4 ? string.Empty : collection[3],
collection.Count < 5 ? string.Empty : collection[4],
collection.Count < 6 ? string.Empty : collection[5],
collection.Count < 7 ? string.Empty : collection[6],
collection.Count < 8 ? string.Empty : collection[7],
collection.Count < 9 ? string.Empty : collection[8],
collection.Count < 10 ? string.Empty : collection[9],
collection.Count < 11 ? string.Empty : collection[10],
collection.Count < 12 ? string.Empty : collection[11],
collection.Count < 13 ? string.Empty : collection[12],
collection.Count < 14 ? string.Empty : collection[13]);
results.Add(point);
}
return new(results);
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Point))]
private partial class PointSourceGenerationContext : JsonSerializerContext
{
}
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
else
{
int take = 14;
string[] lines = File.ReadAllLines(files[0]);
if (lines.Length < take)
logger.LogError("File {files[0]} has less than {take} lines", files[0], take);
else
{
ReadOnlyCollection<string> collection = new(lines);
Complete? complete = Complete.Get(take, collection);
if (complete is null)
logger.LogError("Could not get Complete from {files[0]}", files[0]);
else
{
string json = JsonSerializer.Serialize(complete, CompleteSourceGenerationContext.Default.Complete);
File.WriteAllText($"{files[0]}.json", json);
}
}
}
}
#else
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
logger.LogError("GetComplete is not available in CDE {args[1]}", args[1]);
}
#endif
}

@@ -0,0 +1,24 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WIQL;
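// The classes in this namespace appear to mirror the JSON returned by an Azure DevOps WIQL
// (Work Item Query Language) query: columns, sort columns and the matched work item references.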
public class Column
{
[JsonConstructor]
public Column(
string referenceName,
string name,
string url
)
{
ReferenceName = referenceName;
Name = name;
Url = url;
}
public string ReferenceName { get; set; } // { init; get; }
public string Name { get; set; } // { init; get; }
public string Url { get; set; } // { init; get; }
}
#endif

ADO2024/PI3/WIQL/Field.cs
@@ -0,0 +1,24 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WIQL;
public class Field
{
[JsonConstructor]
public Field(
string referenceName,
string name,
string url
)
{
ReferenceName = referenceName;
Name = name;
Url = url;
}
public string ReferenceName { get; set; } // { init; get; }
public string Name { get; set; } // { init; get; }
public string Url { get; set; } // { init; get; }
}
#endif

ADO2024/PI3/WIQL/Root.cs
@@ -0,0 +1,39 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WIQL;
public class Root
{
[JsonConstructor]
public Root(
string queryType,
string queryResultType,
DateTime asOf,
Column[] columns,
SortColumn[] sortColumns,
WorkItem[] workItems
)
{
QueryType = queryType;
QueryResultType = queryResultType;
AsOf = asOf;
Columns = columns;
SortColumns = sortColumns;
WorkItems = workItems;
}
public string QueryType { get; set; } // { init; get; }
public string QueryResultType { get; set; } // { init; get; }
public DateTime AsOf { get; set; } // { init; get; }
public Column[] Columns { get; set; } // { init; get; }
public SortColumn[] SortColumns { get; set; } // { init; get; }
public WorkItem[] WorkItems { get; set; } // { init; get; }
}
[JsonSourceGenerationOptions(WriteIndented = true, DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, PropertyNameCaseInsensitive = true)]
[JsonSerializable(typeof(Root))]
internal partial class WIQLRootSourceGenerationContext : JsonSerializerContext
{
}
#endif

@@ -0,0 +1,21 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WIQL;
public class SortColumn
{
[JsonConstructor]
public SortColumn(
Field field,
bool descending
)
{
Field = field;
Descending = descending;
}
public Field Field { get; set; } // { init; get; }
public bool Descending { get; set; } // { init; get; }
}
#endif

@@ -0,0 +1,21 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WIQL;
public class WorkItem
{
[JsonConstructor]
public WorkItem(
int id,
string url
)
{
Id = id;
Url = url;
}
public int Id { get; set; } // { init; get; }
public string Url { get; set; } // { init; get; }
}
#endif

@@ -0,0 +1,15 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
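// The classes in this namespace appear to mirror the Azure DevOps work item detail payload
// (identities, links and the flattened field set).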
public class Avatar
{
[JsonConstructor]
public Avatar(
string href
) => Href = href;
public string Href { get; } // { init; get; }
}
#endif

@@ -0,0 +1,29 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class CommentVersionRef
{
[JsonConstructor]
public CommentVersionRef(
int commentId,
int version,
string url
)
{
CommentId = commentId;
Version = version;
URL = url;
}
[JsonPropertyName("commentId")]
public int CommentId { get; } // { init; get; }
[JsonPropertyName("version")]
public int Version { get; } // { init; get; }
[JsonPropertyName("url")]
public string URL { get; } // { init; get; }
}
#endif

View File

@ -0,0 +1,36 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class CustomRequester
{
[JsonConstructor]
public CustomRequester(
string descriptor,
string displayName,
string id,
string imageUrl,
Links links,
string uniqueName,
string url
)
{
Descriptor = descriptor;
DisplayName = displayName;
Id = id;
ImageUrl = imageUrl;
Links = links;
UniqueName = uniqueName;
Url = url;
}
[JsonPropertyName("descriptor")] public string Descriptor { get; }
[JsonPropertyName("displayName")] public string DisplayName { get; }
[JsonPropertyName("id")] public string Id { get; }
[JsonPropertyName("imageUrl")] public string ImageUrl { get; }
[JsonPropertyName("_links")] public Links Links { get; }
[JsonPropertyName("uniqueName")] public string UniqueName { get; }
[JsonPropertyName("url")] public string Url { get; }
}
#endif

View File

@ -0,0 +1,101 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class Fields
{
[JsonConstructor]
public Fields(int customRRminusOE,
CustomRequester? customRequester,
float customWSJF,
int microsoftVSTSCommonBusinessValue,
DateTime microsoftVSTSCommonClosedDate,
int microsoftVSTSCommonPriority,
DateTime microsoftVSTSCommonResolvedDate,
DateTime microsoftVSTSCommonStateChangeDate,
float microsoftVSTSCommonTimeCriticality,
float? microsoftVSTSSchedulingEffort,
DateTime microsoftVSTSSchedulingStartDate,
DateTime microsoftVSTSSchedulingTargetDate,
string systemAreaPath,
SystemAssignedTo systemAssignedTo,
SystemChangedBy systemChangedBy,
DateTime systemChangedDate,
int systemCommentCount,
SystemCreatedBy systemCreatedBy,
DateTime systemCreatedDate,
string systemDescription,
string systemHistory,
string systemIterationPath,
int systemParent,
string systemReason,
string systemState,
string systemTags,
string systemTeamProject,
string systemTitle,
string systemWorkItemType)
{
CustomRequester = customRequester;
CustomRRminusOE = customRRminusOE;
CustomWSJF = customWSJF;
MicrosoftVSTSCommonBusinessValue = microsoftVSTSCommonBusinessValue;
MicrosoftVSTSCommonClosedDate = microsoftVSTSCommonClosedDate;
MicrosoftVSTSCommonPriority = microsoftVSTSCommonPriority;
MicrosoftVSTSCommonResolvedDate = microsoftVSTSCommonResolvedDate;
MicrosoftVSTSCommonStateChangeDate = microsoftVSTSCommonStateChangeDate;
MicrosoftVSTSCommonTimeCriticality = microsoftVSTSCommonTimeCriticality;
MicrosoftVSTSSchedulingEffort = microsoftVSTSSchedulingEffort;
MicrosoftVSTSSchedulingStartDate = microsoftVSTSSchedulingStartDate;
MicrosoftVSTSSchedulingTargetDate = microsoftVSTSSchedulingTargetDate;
SystemAreaPath = systemAreaPath;
SystemAssignedTo = systemAssignedTo;
SystemChangedBy = systemChangedBy;
SystemChangedDate = systemChangedDate;
SystemCommentCount = systemCommentCount;
SystemCreatedBy = systemCreatedBy;
SystemCreatedDate = systemCreatedDate;
SystemDescription = systemDescription;
SystemHistory = systemHistory;
SystemIterationPath = systemIterationPath;
SystemParent = systemParent;
SystemReason = systemReason;
SystemState = systemState;
SystemTags = systemTags;
SystemTeamProject = systemTeamProject;
SystemTitle = systemTitle;
SystemWorkItemType = systemWorkItemType;
}
[JsonPropertyName("Custom.Requester")] public CustomRequester? CustomRequester { get; } // { init; get; }
[JsonPropertyName("Custom.RRminusOE")] public int CustomRRminusOE { get; } // { init; get; }
[JsonPropertyName("Custom.WSJF")] public float CustomWSJF { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.BusinessValue")] public int MicrosoftVSTSCommonBusinessValue { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.ClosedDate")] public DateTime MicrosoftVSTSCommonClosedDate { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.Priority")] public int MicrosoftVSTSCommonPriority { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.ResolvedDate")] public DateTime MicrosoftVSTSCommonResolvedDate { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.StateChangeDate")] public DateTime MicrosoftVSTSCommonStateChangeDate { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Common.TimeCriticality")] public float MicrosoftVSTSCommonTimeCriticality { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Scheduling.Effort")] public float? MicrosoftVSTSSchedulingEffort { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Scheduling.StartDate")] public DateTime MicrosoftVSTSSchedulingStartDate { get; } // { init; get; }
[JsonPropertyName("Microsoft.VSTS.Scheduling.TargetDate")] public DateTime MicrosoftVSTSSchedulingTargetDate { get; } // { init; get; }
[JsonPropertyName("System.AreaPath")] public string SystemAreaPath { get; } // { init; get; }
[JsonPropertyName("System.AssignedTo")] public SystemAssignedTo? SystemAssignedTo { get; } // { init; get; }
[JsonPropertyName("System.ChangedBy")] public SystemChangedBy SystemChangedBy { get; } // { init; get; }
[JsonPropertyName("System.ChangedDate")] public DateTime SystemChangedDate { get; } // { init; get; }
[JsonPropertyName("System.CommentCount")] public int SystemCommentCount { get; } // { init; get; }
[JsonPropertyName("System.CreatedBy")] public SystemCreatedBy SystemCreatedBy { get; } // { init; get; }
[JsonPropertyName("System.CreatedDate")] public DateTime SystemCreatedDate { get; } // { init; get; }
[JsonPropertyName("System.Description")] public string SystemDescription { get; } // { init; get; }
[JsonPropertyName("System.History")] public string SystemHistory { get; } // { init; get; }
[JsonPropertyName("System.IterationPath")] public string SystemIterationPath { get; } // { init; get; }
[JsonPropertyName("System.Parent")] public int SystemParent { get; } // { init; get; }
[JsonPropertyName("System.Reason")] public string SystemReason { get; } // { init; get; }
[JsonPropertyName("System.State")] public string SystemState { get; } // { init; get; }
[JsonPropertyName("System.Tags")] public string SystemTags { get; } // { init; get; }
[JsonPropertyName("System.TeamProject")] public string SystemTeamProject { get; } // { init; get; }
[JsonPropertyName("System.Title")] public string SystemTitle { get; } // { init; get; }
[JsonPropertyName("System.WorkItemType")] public string SystemWorkItemType { get; } // { init; get; }
}
#endif

View File

@ -0,0 +1,15 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class Html
{
[JsonConstructor]
public Html(
string href
) => Href = href;
public string Href { get; } // { init; get; }
}
#endif

View File

@ -0,0 +1,16 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class Links
{
[JsonConstructor]
public Links(
Avatar avatar
) => Avatar = avatar;
[JsonPropertyName("avatar")]
public Avatar Avatar { get; }
}
#endif

View File

@ -0,0 +1,49 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class SystemAssignedTo
{
[JsonConstructor]
public SystemAssignedTo(
string displayName,
string url,
Links links,
string id,
string uniqueName,
string imageUrl,
string descriptor
)
{
DisplayName = displayName;
Url = url;
Links = links;
Id = id;
UniqueName = uniqueName;
ImageUrl = imageUrl;
Descriptor = descriptor;
}
[JsonPropertyName("displayName")]
public string DisplayName { get; }
[JsonPropertyName("url")]
public string Url { get; }
[JsonPropertyName("_links")]
public Links Links { get; }
[JsonPropertyName("id")]
public string Id { get; }
[JsonPropertyName("uniqueName")]
public string UniqueName { get; }
[JsonPropertyName("imageUrl")]
public string ImageUrl { get; }
[JsonPropertyName("descriptor")]
public string Descriptor { get; }
}
#endif

View File

@ -0,0 +1,49 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class SystemChangedBy
{
[JsonConstructor]
public SystemChangedBy(
string displayName,
string url,
Links links,
string id,
string uniqueName,
string imageUrl,
string descriptor
)
{
DisplayName = displayName;
Url = url;
Links = links;
Id = id;
UniqueName = uniqueName;
ImageUrl = imageUrl;
Descriptor = descriptor;
}
[JsonPropertyName("displayName")]
public string DisplayName { get; }
[JsonPropertyName("url")]
public string Url { get; }
[JsonPropertyName("_links")]
public Links Links { get; }
[JsonPropertyName("id")]
public string Id { get; }
[JsonPropertyName("uniqueName")]
public string UniqueName { get; }
[JsonPropertyName("imageUrl")]
public string ImageUrl { get; }
[JsonPropertyName("descriptor")]
public string Descriptor { get; }
}
#endif

View File

@ -0,0 +1,49 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class SystemCreatedBy
{
[JsonConstructor]
public SystemCreatedBy(
string displayName,
string url,
Links links,
string id,
string uniqueName,
string imageUrl,
string descriptor
)
{
DisplayName = displayName;
Url = url;
Links = links;
Id = id;
UniqueName = uniqueName;
ImageUrl = imageUrl;
Descriptor = descriptor;
}
[JsonPropertyName("displayName")]
public string DisplayName { get; }
[JsonPropertyName("url")]
public string Url { get; }
[JsonPropertyName("_links")]
public Links Links { get; }
[JsonPropertyName("id")]
public string Id { get; }
[JsonPropertyName("uniqueName")]
public string UniqueName { get; }
[JsonPropertyName("imageUrl")]
public string ImageUrl { get; }
[JsonPropertyName("descriptor")]
public string Descriptor { get; }
}
#endif

View File

@ -0,0 +1,56 @@
#if WorkItems
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class Value
{
[JsonConstructor]
public Value(
int id,
int rev,
Fields fields,
object[] relations,
CommentVersionRef commentVersionRef,
string url
)
{
Id = id;
Rev = rev;
Fields = fields;
Relations = relations;
CommentVersionRef = commentVersionRef;
Url = url;
}
[JsonPropertyName("id")]
public int Id { get; }
[JsonPropertyName("rev")]
public int Rev { get; }
[JsonPropertyName("fields")]
public Fields Fields { get; }
[JsonPropertyName("relations")]
public object[] Relations { get; }
[JsonPropertyName("commentVersionRef")]
public CommentVersionRef CommentVersionRef { get; }
[JsonPropertyName("url")]
public string Url { get; }
}
[JsonSourceGenerationOptions(WriteIndented = true, DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, PropertyNameCaseInsensitive = true)]
[JsonSerializable(typeof(Value[]))]
internal partial class ValueCollectionSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true, DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull, PropertyNameCaseInsensitive = true)]
[JsonSerializable(typeof(Value))]
internal partial class ValueSourceGenerationContext : JsonSerializerContext
{
}
#endif
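A brief sketch of reading a saved work-item payload back through the ValueSourceGenerationContext declared above, again assuming the same assembly and build symbol; the file path parameter and the console output are illustrative assumptions:

#if WorkItems
using System.Text.Json;
using File_Folder_Helper.ADO2024.PI3.WorkItems;

internal static class ValueExample
{
    // jsonFile is assumed to point at a single saved work-item response.
    internal static void Print(string jsonFile)
    {
        string json = File.ReadAllText(jsonFile);
        Value? value = JsonSerializer.Deserialize(json, ValueSourceGenerationContext.Default.Value);
        if (value is not null)
            Console.WriteLine($"{value.Id} rev {value.Rev}: {value.Fields.SystemTitle} [{value.Fields.SystemState}]");
    }
}
#endif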

View File

@ -0,0 +1,21 @@
#if WorkItems
namespace File_Folder_Helper.ADO2024.PI3.WorkItems;
public class ValueWithReq
{
public ValueWithReq(
Value value,
int req,
string json
)
{
Value = value;
Req = req;
Json = json;
}
public Value Value { get; set; } // { init; get; }
public int Req { get; set; } // { init; get; }
public string Json { get; set; } // { init; get; }
}
#endif

View File

@ -0,0 +1,384 @@
using Microsoft.Extensions.Logging;
#if WorkItems
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
#endif
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241108
{
#if WorkItems
private record Attribute([property: JsonPropertyName("isLocked")] bool IsLocked,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("parameterTitle")] string? ParameterTitle,
[property: JsonPropertyName("state")] string? State,
[property: JsonPropertyName("workItemType")] string? WorkItemType);
private record Relation([property: JsonPropertyName("attributes")] Attribute Attributes,
[property: JsonPropertyName("id")] int Id,
[property: JsonPropertyName("rel")] string Rel);
private record WorkItem(DateTime? ActivatedDate,
string AreaPath,
string? AssignedTo,
long? BusinessValue,
DateTime ChangedDate,
DateTime? ClosedDate,
int CommentCount,
DateTime CreatedDate,
string Description,
long? Effort,
int Id,
string IterationPath,
int? Parent,
int? Priority,
Relation[]? Relations,
string? Requester,
DateTime? ResolvedDate,
int Revision,
long? RiskReductionMinusOpportunityEnablement,
DateTime? StartDate,
string State,
string Tags,
DateTime? TargetDate,
long? TimeCriticality,
string Title,
string? Violation,
long? WeightedShortestJobFirst,
string WorkItemType)
{
public override string ToString() => $"{Id} - {WorkItemType} - {Title}";
public static WorkItem Get(WorkItem workItem, Relation[] relations)
{
WorkItem result = new(workItem.ActivatedDate,
workItem.AreaPath,
workItem.AssignedTo,
workItem.BusinessValue,
workItem.ChangedDate,
workItem.ClosedDate,
workItem.CommentCount,
workItem.CreatedDate,
workItem.Description,
workItem.Effort,
workItem.Id,
workItem.IterationPath,
workItem.Parent,
workItem.Priority,
relations,
workItem.Requester,
workItem.ResolvedDate,
workItem.Revision,
workItem.RiskReductionMinusOpportunityEnablement,
workItem.StartDate,
workItem.State,
workItem.Tags,
workItem.TargetDate,
workItem.TimeCriticality,
workItem.Title,
workItem.Violation,
workItem.WeightedShortestJobFirst,
workItem.WorkItemType);
return result;
}
public static WorkItem? GetWithOutRelations(WorkItem? workItem)
{
WorkItem? result = workItem is null ? null : new(workItem.ActivatedDate,
workItem.AreaPath,
workItem.AssignedTo,
workItem.BusinessValue,
workItem.ChangedDate,
workItem.ClosedDate,
workItem.CommentCount,
workItem.CreatedDate,
workItem.Description,
workItem.Effort,
workItem.Id,
workItem.IterationPath,
workItem.Parent,
workItem.Priority,
Array.Empty<Relation>(),
workItem.Requester,
workItem.ResolvedDate,
workItem.Revision,
workItem.RiskReductionMinusOpportunityEnablement,
workItem.StartDate,
workItem.State,
workItem.Tags,
workItem.TargetDate,
workItem.TimeCriticality,
workItem.Title,
workItem.Violation,
workItem.WeightedShortestJobFirst,
workItem.WorkItemType);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(WorkItem))]
private partial class WorkItemSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(WorkItem[]))]
private partial class WorkItemCollectionSourceGenerationContext : JsonSerializerContext
{
}
private record Record(WorkItem WorkItem, WorkItem? Parent, Record[]? Children, Record[]? Related, Record[]? Successors)
{
internal static Record GetWithoutNesting(Record record, string? violation)
{
Record result;
WorkItem workItem = new(record.WorkItem.ActivatedDate,
record.WorkItem.AreaPath,
record.WorkItem.AssignedTo,
record.WorkItem.BusinessValue,
record.WorkItem.ChangedDate,
record.WorkItem.ClosedDate,
record.WorkItem.CommentCount,
record.WorkItem.CreatedDate,
record.WorkItem.Description,
record.WorkItem.Effort,
record.WorkItem.Id,
record.WorkItem.IterationPath,
record.WorkItem.Parent,
record.WorkItem.Priority,
record.WorkItem.Relations,
record.WorkItem.Requester,
record.WorkItem.ResolvedDate,
record.WorkItem.Revision,
record.WorkItem.RiskReductionMinusOpportunityEnablement,
record.WorkItem.StartDate,
record.WorkItem.State,
record.WorkItem.Tags,
record.WorkItem.TargetDate,
record.WorkItem.TimeCriticality,
record.WorkItem.Title,
record.WorkItem.Violation is null ? violation : record.WorkItem.Violation,
record.WorkItem.WeightedShortestJobFirst,
record.WorkItem.WorkItemType);
result = new(workItem, record.Parent, null, null, null);
return result;
}
private static Record Get(Record record, bool keepRelations)
{
Record result;
Record[]? childRecords;
Record[]? relatedRecords;
Record[]? successorRecords;
List<Record> relationRecords;
WorkItem? parentWorkItem = keepRelations ? record.Parent : WorkItem.GetWithOutRelations(record.Parent);
WorkItem? workItem = keepRelations ? record.WorkItem : WorkItem.GetWithOutRelations(record.WorkItem) ?? throw new Exception();
if (record.Children is null)
childRecords = null;
else
{
relationRecords = [];
foreach (Record r in record.Children)
relationRecords.Add(Get(r, keepRelations));
childRecords = relationRecords.ToArray();
}
if (record.Related is null)
relatedRecords = null;
else
{
relationRecords = [];
foreach (Record r in record.Related)
relationRecords.Add(Get(r, keepRelations));
relatedRecords = relationRecords.ToArray();
}
if (record.Successors is null)
successorRecords = null;
else
{
relationRecords = [];
foreach (Record r in record.Successors)
relationRecords.Add(Get(r, keepRelations));
successorRecords = relationRecords.ToArray();
}
result = new(workItem, parentWorkItem, childRecords, relatedRecords, successorRecords);
return result;
}
internal static Record Get(WorkItem workItem, WorkItem? parent, ReadOnlyCollection<Record>? children, ReadOnlyCollection<Record>? related, ReadOnlyCollection<Record>? successors, bool keepRelations)
{
Record result;
Record record = new(workItem, parent, children?.ToArray(), related?.ToArray(), successors?.ToArray());
result = Get(record, keepRelations);
return result;
}
internal static ReadOnlyCollection<Record> GetKeyValuePairs(ReadOnlyDictionary<int, WorkItem> keyValuePairs, WorkItem workItem, string relationName, List<bool> nests, bool keepRelations)
{
List<Record> results = [];
Record record;
nests.Add(true);
WorkItem? parentWorkItem;
WorkItem? relationWorkItem;
List<WorkItem> collection = [];
ReadOnlyCollection<Record>? childRecords;
ReadOnlyCollection<Record>? relatedRecords;
ReadOnlyCollection<Record>? successorRecords;
if (workItem.Relations is not null && workItem.Relations.Length > 0)
{
collection.Clear();
foreach (Relation relation in workItem.Relations)
{
if (relation.Attributes.Name != relationName)
continue;
if (workItem.Parent is not null && relation.Id == workItem.Parent.Value)
continue;
if (!keyValuePairs.TryGetValue(relation.Id, out relationWorkItem))
continue;
collection.Add(relationWorkItem);
}
collection = (from l in collection orderby l.State != "Closed", l.Id select l).ToList();
foreach (WorkItem w in collection)
{
if (nests.Count > 500)
break;
if (w.Parent is null)
parentWorkItem = null;
else
_ = keyValuePairs.TryGetValue(w.Parent.Value, out parentWorkItem);
childRecords = GetKeyValuePairs(keyValuePairs, w, "Child", nests, keepRelations); // Forward
relatedRecords = null; // GetKeyValuePairs(keyValuePairs, w, "Related", nests, keepRelations); // Related
successorRecords = null; // GetKeyValuePairs(keyValuePairs, w, "Successor", nests, keepRelations); // Forward
// predecessorRecords = GetKeyValuePairs(keyValuePairs, w, "Predecessor", nests, keepRelations); // Reverse
record = Get(w, parentWorkItem, childRecords, relatedRecords, successorRecords, keepRelations);
results.Add(record);
}
}
return new(results);
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Record[]))]
private partial class RecordCollectionBCommonSourceGenerationContext : JsonSerializerContext
{
}
private static ReadOnlyDictionary<int, Record> GetKeyValuePairs(ReadOnlyDictionary<int, WorkItem> keyValuePairs, bool keepRelations)
{
Dictionary<int, Record> results = [];
Record record;
List<bool> nests = [];
WorkItem? parentWorkItem;
ReadOnlyCollection<Record> childRecords;
ReadOnlyCollection<Record> relatedRecords;
ReadOnlyCollection<Record> successorRecords;
foreach (KeyValuePair<int, WorkItem> keyValuePair in keyValuePairs)
{
nests.Clear();
if (keyValuePair.Value.Parent is null)
parentWorkItem = null;
else
_ = keyValuePairs.TryGetValue(keyValuePair.Value.Parent.Value, out parentWorkItem);
try
{
childRecords = Record.GetKeyValuePairs(keyValuePairs, keyValuePair.Value, "Child", nests, keepRelations); // Forward
// records = Record.GetKeyValuePairs(keyValuePairs, keyValuePair.Value, "Predecessor", nests, keepRelations); // Reverse
relatedRecords = Record.GetKeyValuePairs(keyValuePairs, keyValuePair.Value, "Related", nests, keepRelations); // Related
successorRecords = Record.GetKeyValuePairs(keyValuePairs, keyValuePair.Value, "Successor", nests, keepRelations); // Forward
record = Record.Get(keyValuePair.Value, parentWorkItem, childRecords, relatedRecords, successorRecords, keepRelations);
}
catch (Exception)
{
record = new(keyValuePair.Value, parentWorkItem, [], [], []);
}
results.Add(keyValuePair.Key, record);
}
return new(results);
}
private static ReadOnlyDictionary<int, Record> GetWorkItems(ReadOnlyCollection<WorkItem> workItems, bool keepRelations)
{
ReadOnlyDictionary<int, Record> results;
Dictionary<int, WorkItem> keyValuePairs = [];
foreach (WorkItem workItem in workItems)
keyValuePairs.Add(workItem.Id, workItem);
results = GetKeyValuePairs(new(keyValuePairs), keepRelations);
return results;
}
private static ReadOnlyCollection<WorkItem>? GetWorkItems(string fileName, string sourceDirectory)
{
WorkItem[]? results;
string? checkFile = null;
string? checkDirectory = sourceDirectory;
string? pathRoot = Path.GetPathRoot(sourceDirectory);
for (int i = 0; i < int.MaxValue; i++)
{
checkDirectory = Path.GetDirectoryName(checkDirectory);
if (string.IsNullOrEmpty(checkDirectory) || checkDirectory == pathRoot)
break;
checkFile = Path.Combine(checkDirectory, fileName);
if (File.Exists(checkFile))
break;
checkFile = null;
}
if (checkFile is null)
results = null;
else
{
string json = File.ReadAllText(checkFile);
results = JsonSerializer.Deserialize(json, WorkItemCollectionSourceGenerationContext.Default.WorkItemArray);
}
return results is null ? null : new(results);
}
internal static void WriteMarkdown(ILogger<Worker> logger, List<string> args)
{
// string url = args[5];
bool keepRelations = true;
string fileName = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string sourceDirectoryName = Path.GetFileName(sourceDirectory);
string idCheck = sourceDirectoryName.Split('-', StringSplitOptions.None)[0];
if (!int.TryParse(idCheck, out int id))
logger.LogInformation("Not valid directory!");
else
{
ReadOnlyCollection<WorkItem>? workItems = GetWorkItems(fileName, sourceDirectory);
if (workItems is null)
logger.LogInformation("No file found!");
else
{
Record? record;
ReadOnlyDictionary<int, Record> keyValuePairs = GetWorkItems(workItems, keepRelations);
logger.LogInformation("Found {keyValuePairs}", keyValuePairs.Count);
if (!keyValuePairs.TryGetValue(id, out record))
logger.LogInformation($"Id {id} not found!");
else
{
logger.LogInformation($"Id {id} found with title {record.WorkItem.Title}!");
}
}
}
}
#else
internal static void WriteMarkdown(ILogger<Worker> logger, List<string> args)
{
logger.LogError("WriteMarkdown is not available in WorkItems {args[0]}", args[0]);
logger.LogError("WriteMarkdown is not available in WorkItems {args[1]}", args[1]);
}
#endif
}
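GetWorkItems(fileName, sourceDirectory) walks parent directories until it finds the named JSON file or reaches the drive root. The same walk, reduced to a standalone sketch (the method and variable names here are mine, not the repository's):

// Return the full path of fileName in the nearest ancestor of startDirectory,
// or null when the walk reaches the drive root without finding it.
static string? FindInAncestors(string startDirectory, string fileName)
{
    string? pathRoot = Path.GetPathRoot(startDirectory);
    string? directory = Path.GetDirectoryName(startDirectory);
    while (!string.IsNullOrEmpty(directory) && directory != pathRoot)
    {
        string candidate = Path.Combine(directory, fileName);
        if (File.Exists(candidate))
            return candidate;
        directory = Path.GetDirectoryName(directory);
    }
    return null;
}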

View File

@ -0,0 +1,62 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241115
{
#if BIORAD
internal static void ScanPast(string text, int[] i, string search)
{
int num = text.IndexOf(search, i[0]);
if (num > -1)
i[0] = num + search.Length;
else
i[0] = text.Length;
}
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
else
{
List<string> group = [];
string text = File.ReadAllText(files[0]);
int[] i = [0];
ScanPast(text, i, "Recipe ID:");
ScanPast(text, i, "*");
#pragma warning disable IDE0057
string[] segments = text.Substring(i[0]).Split('*');
string[] segmentsB;
string[] segmentsC;
foreach (string segment in segments)
{
segmentsB = segment.Split(Environment.NewLine);
segmentsC = segmentsB[0].Split(' ');
if (segment.Contains("Group"))
{
}
}
}
}
#else
internal static void GetComplete(ILogger<Worker> logger, List<string> args)
{
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
logger.LogError("GetComplete is not available in BioRad {args[1]}", args[1]);
}
#endif
}
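ScanPast advances the one-element index array past the next occurrence of the search string, or to the end of the text when the string is absent. A small usage sketch with a made-up input, assuming the BIORAD symbol is defined and the call is made from within the same assembly:

string text = "Recipe ID: *ABC123*Group 1";        // made-up instrument text
int[] i = [0];
Helper20241115.ScanPast(text, i, "Recipe ID:");    // i[0] now sits just after "Recipe ID:"
Helper20241115.ScanPast(text, i, "*");             // i[0] now sits just after the first '*'
string remainder = text[i[0]..];                   // "ABC123*Group 1"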

View File

@ -0,0 +1,24 @@
using Microsoft.Extensions.Logging;
using System.Text;
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241204
{
internal static void ConvertToUTF8(ILogger<Worker> logger, List<string> args)
{
string text;
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length == 0)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
foreach (string file in files)
{
text = File.ReadAllText(file);
File.WriteAllText(file, text, Encoding.UTF8);
}
}
}
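Note that the Encoding.UTF8 instance used here writes a byte-order mark into every converted file. If a BOM-less conversion were ever wanted instead, the encoding could be constructed explicitly; a sketch under that assumption (this is not what the helper does):

using System.Text;
// A UTF8Encoding built with encoderShouldEmitUTF8Identifier: false writes no byte-order mark.
static void ConvertWithoutBom(string file)
{
    string text = File.ReadAllText(file);
    File.WriteAllText(file, text, new UTF8Encoding(encoderShouldEmitUTF8Identifier: false));
}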

View File

@ -0,0 +1,35 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241212
{
internal static void Rename(ILogger<Worker> logger, List<string> args)
{
string newFile;
string fileName;
string newFileName;
string directoryName;
string searchPattern = args[2];
string[] searchPatterns = args[3].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogError("No files found in {sourceDirectory} with search pattern {searchPattern}", sourceDirectory, searchPattern);
foreach (string file in files)
{
fileName = Path.GetFileName(file);
directoryName = Path.GetDirectoryName(file) ?? throw new Exception();
newFileName = fileName;
foreach (string pattern in searchPatterns)
newFileName = newFileName.Replace(pattern, "");
newFile = Path.Combine(directoryName, newFileName);
if (File.Exists(newFile))
logger.LogError("File {newFile} already exists", newFile);
else
File.Move(file, newFile);
}
}
}
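The '~'-separated fourth argument is a list of literal substrings to strip from each matching file name before the move. A worked example with made-up values:

string fileName = "report-copy (1).txt";             // sample file name
string[] searchPatterns = "-copy~ (1)".Split('~');   // sample args[3]
foreach (string pattern in searchPatterns)
    fileName = fileName.Replace(pattern, "");
// fileName is now "report.txt", so the file would be moved to that name in place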

View File

@ -0,0 +1,339 @@
using DiscUtils.Iso9660;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Diagnostics;
using System.IO.Compression;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241217
{
private record SecureShell(
);
private record ServerMessageBlock(string Path,
bool Required);
private record Target(SecureShell? SecureShell,
ServerMessageBlock? ServerMessageBlock);
private record File(long LastWriteTicks,
long Length,
string RelativePath);
private record Record(string Directory,
Job Job,
string Path);
private record Job(string? AlternatePath,
string Directory,
string Extension,
File[] Files,
int FilesCount,
double FilesTotalLength,
int Keep,
Target[] Targets);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Job))]
private partial class JobSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(File[]))]
private partial class FilesSourceGenerationContext : JsonSerializerContext
{
}
internal static void Backup(ILogger<Worker> logger, List<string> args)
{
Job jobNew;
string path;
string? json;
string asidePath;
bool areTheyTheSame;
string directoryName;
IEnumerable<Record> records;
logger.LogInformation(args[0]);
logger.LogInformation(args[1]);
logger.LogInformation(args[2]);
logger.LogInformation(args[3]);
logger.LogInformation(args[4]);
ReadOnlyCollection<File> files;
string searchPattern = args[2];
IEnumerable<string> searchPatternFiles;
string[] ignoreFileNames = args[3].Split('~');
string destination = Path.GetFullPath(args[4]);
string sourceDirectory = Path.GetFullPath(args[0]);
char destinationDriveLetter = destination.Split(':')[0][0];
logger.LogInformation("Searching <{sourceDirectory}> with search pattern {searchPattern}", args[0], searchPattern);
if (Debugger.IsAttached)
Verify(searchPattern, ignoreFileNames);
for (int i = 1; i < 3; i++)
{
if (i == 1)
{
searchPatternFiles = Directory.EnumerateFiles(sourceDirectory, searchPattern, new EnumerationOptions { IgnoreInaccessible = true, RecurseSubdirectories = true });
}
else if (i == 2)
{
searchPatternFiles = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
}
else
throw new NotImplementedException();
records = GetRecords(sourceDirectory, searchPatternFiles);
foreach (Record record in records)
{
if (record.Job is null || string.IsNullOrEmpty(record.Job.Extension))
continue;
logger.LogInformation("Searching <{directory}>", record.Directory);
files = GetFiles(searchPattern, ignoreFileNames, record);
jobNew = GetJob(searchPattern, ignoreFileNames, record, files);
json = JsonSerializer.Serialize(jobNew, JobSourceGenerationContext.Default.Job);
areTheyTheSame = GetAreTheyTheSame(logger, searchPattern, ignoreFileNames, record, jobNew);
if (areTheyTheSame)
{
WriteAllText(record.Path, json);
continue;
}
directoryName = Path.GetFileName(record.Directory);
asidePath = Path.Combine(record.Directory, $"{directoryName}-{DateTime.Now:yyyy-MM-dd-HH-mm-ss-fff}{record.Job.Extension}");
path = $"{destinationDriveLetter}{asidePath[1..]}";
logger.LogInformation("Writing <{path}> extension", path);
WritePassedExtension(record, files, directoryName, path);
logger.LogInformation("Wrote <{path}> extension", path);
MovePassedExtension(destination, path);
logger.LogInformation("Moved <{path}> extension", path);
WriteAllText(record.Path, json);
Helpers.HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, $"{destinationDriveLetter}{record.Directory[1..]}");
}
}
}
private static void Verify(string searchPattern, string[] ignoreFileNames)
{
List<Target> targets = [
new(new SecureShell(), null),
new(null, new ServerMessageBlock("\\\\mesfs.infineon.com\\EC_APC\\DEV", true))
];
string directory = Path.Combine(Environment.CurrentDirectory, ".vscode", "helper");
if (!Directory.Exists(directory))
_ = Directory.CreateDirectory(directory);
string path = Path.Combine(directory, "verify.json");
ReadOnlyCollection<File> files = GetFiles(directory, searchPattern, ignoreFileNames);
ReadOnlyCollection<File> collection = GetFilteredFiles(searchPattern, ignoreFileNames, files);
double filesTotalLength = collection.Select(l => l.Length).Sum();
Job job = new(AlternatePath: "C:/Users/phares",
Directory: directory,
Extension: ".iso",
Files: collection.ToArray(),
FilesCount: files.Count,
FilesTotalLength: filesTotalLength,
Keep: 3,
Targets: targets.ToArray());
string json = JsonSerializer.Serialize(job, JobSourceGenerationContext.Default.Job);
WriteAllText(path, json);
}
private static ReadOnlyCollection<File> GetFilteredFiles(string searchPattern, string[] ignoreFileNames, ReadOnlyCollection<File> files)
{
List<File> results = [];
string fileName;
foreach (File file in files)
{
if (file.RelativePath == searchPattern)
continue;
fileName = Path.GetFileName(file.RelativePath);
if (fileName == searchPattern)
throw new Exception("Found nested file!");
if (ignoreFileNames.Any(l => l == fileName))
continue;
if (file.Length == 0)
continue;
results.Add(file);
}
return results.AsReadOnly();
}
private static IEnumerable<Record> GetRecords(string directory, IEnumerable<string> files)
{
Job? job;
string json;
Record record;
string fileName;
string directoryName;
foreach (string file in files)
{
fileName = Path.GetFileName(file);
directoryName = Path.GetDirectoryName(file) ?? throw new Exception();
if (!fileName.StartsWith('.'))
{
System.IO.File.Move(file, Path.Combine(directoryName, $".{fileName}"));
continue;
}
json = System.IO.File.ReadAllText(file);
if (string.IsNullOrEmpty(json) || json is "{}" or "[]")
job = null;
else
job = JsonSerializer.Deserialize(json, JobSourceGenerationContext.Default.Job);
job ??= new(AlternatePath: null,
Directory: directory,
Extension: ".iso",
Files: [],
FilesCount: 0,
FilesTotalLength: 0,
Keep: 3,
Targets: []);
record = new(Directory: directoryName, Job: job, Path: file);
yield return record;
}
}
private static ReadOnlyCollection<File> GetFiles(string directory, string searchPattern, string[] ignoreFileNames)
{
List<File> results = [];
File file;
string relativePath;
string[] files = Directory.GetFiles(directory, "*", SearchOption.AllDirectories);
FileInfo[] fileInfoCollection = files.Select(l => new FileInfo(l)).ToArray();
foreach (FileInfo fileInfo in fileInfoCollection)
{
if (fileInfo.Name == searchPattern)
continue;
if (ignoreFileNames.Any(l => l == fileInfo.Name))
continue;
if (!string.IsNullOrEmpty(fileInfo.LinkTarget))
continue;
relativePath = Path.GetRelativePath(directory, fileInfo.FullName).Replace(';', '_');
if (relativePath.StartsWith(".."))
relativePath = relativePath[3..];
file = new(LastWriteTicks: fileInfo.LastWriteTime.Ticks, Length: fileInfo.Length, RelativePath: relativePath);
results.Add(file);
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<File> GetFiles(string searchPattern, string[] ignoreFileNames, Record record) =>
GetFiles(record.Directory, searchPattern, ignoreFileNames);
private static Job GetJob(string searchPattern, string[] ignoreFileNames, Record record, ReadOnlyCollection<File> files)
{
Job result;
ReadOnlyCollection<File> collection = GetFilteredFiles(searchPattern, ignoreFileNames, files);
double filesTotalLengthNew = collection.Select(l => l.Length).Sum();
result = new(AlternatePath: record.Job.AlternatePath,
Directory: record.Directory,
Extension: record.Job.Extension,
Files: collection.ToArray(),
FilesCount: collection.Count,
FilesTotalLength: filesTotalLengthNew,
Keep: record.Job.Keep,
Targets: record.Job.Targets);
return result;
}
private static bool GetAreTheyTheSame(ILogger<Worker> logger, string searchPattern, string[] ignoreFileNames, Record record, Job jobNew)
{
bool result;
ReadOnlyCollection<File> collection = GetFilteredFiles(searchPattern, ignoreFileNames, record.Job.Files.AsReadOnly());
int filesCountOld = collection.Count;
int filesCountNew = jobNew.Files.Length;
if (filesCountNew != filesCountOld)
{
result = false;
logger.LogInformation("<{directory}> file count has changed {filesCountNew} != {filesCountOld}", record.Directory, filesCountNew, filesCountOld);
}
else
{
double filesTotalLengthOld = collection.Select(l => l.Length).Sum();
double filesTotalLengthNew = jobNew.Files.Select(l => l.Length).Sum();
if (filesTotalLengthNew != filesTotalLengthOld)
{
result = false;
logger.LogInformation("<{directory}> file length has changed {filesTotalLengthNew} != {filesTotalLengthOld}", record.Directory, filesTotalLengthNew, filesTotalLengthOld);
}
else
{
string jsonNew = JsonSerializer.Serialize(jobNew.Files, FilesSourceGenerationContext.Default.FileArray);
string jsonOld = JsonSerializer.Serialize(collection.ToArray(), FilesSourceGenerationContext.Default.FileArray);
if (jsonNew == jsonOld)
result = true;
else
{
result = false;
if (Debugger.IsAttached)
{
WriteAllText(Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", "old.json"), jsonOld);
WriteAllText(Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", "new.json"), jsonNew);
}
logger.LogInformation("<{directory}> file serialized are different {filesTotalLengthNew} != {filesTotalLengthOld}", record.Directory, filesTotalLengthNew, filesTotalLengthOld);
}
}
}
return result;
}
private static void WriteAllText(string path, string text)
{
string check = !System.IO.File.Exists(path) ? string.Empty : System.IO.File.ReadAllText(path);
if (check != text)
System.IO.File.WriteAllText(path, text);
}
private static void WritePassedExtension(Record record, ReadOnlyCollection<File> files, string directoryName, string path)
{
if (record.Job.Extension.Equals(".iso", StringComparison.OrdinalIgnoreCase))
WriteISO(record, files, path, directoryName);
else if (record.Job.Extension.Equals(".zip", StringComparison.OrdinalIgnoreCase))
WriteZIP(record, files, path);
else
throw new NotImplementedException();
}
private static void MovePassedExtension(string destination, string path)
{
string checkPath = $"{destination}{path[2..]}";
string checkDirectory = Path.GetDirectoryName(checkPath) ?? throw new Exception();
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
if (System.IO.File.Exists(checkPath))
throw new NotImplementedException($"<{checkPath}> already exists!");
System.IO.File.Move(path, checkPath);
}
private static void WriteISO(Record record, ReadOnlyCollection<File> files, string path, string directoryName)
{
string checkDirectory = Path.GetDirectoryName(path) ?? throw new Exception();
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
CDBuilder builder = new() { UseJoliet = true, VolumeIdentifier = directoryName.Length < 25 ? directoryName : directoryName[..25] };
foreach (File file in files)
_ = builder.AddFile(file.RelativePath, Path.Combine(record.Directory, file.RelativePath));
builder.Build(path);
}
private static void WriteZIP(Record record, ReadOnlyCollection<File> files, string path)
{
string checkDirectory = Path.GetDirectoryName(path) ?? throw new Exception();
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
using ZipArchive zip = ZipFile.Open(path, ZipArchiveMode.Create);
string directoryEntry;
List<string> directoryEntries = [];
foreach (File file in files)
{
directoryEntry = Path.GetDirectoryName(file.RelativePath) ?? throw new Exception();
if (directoryEntries.Contains(directoryEntry))
continue;
directoryEntries.Add(directoryEntry);
// create each directory placeholder entry once; the file entries themselves are added below
_ = zip.CreateEntry($"{directoryEntry}/");
}
foreach (File file in files)
_ = zip.CreateEntryFromFile(Path.Combine(record.Directory, file.RelativePath), file.RelativePath);
}
}
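The drive-letter substitution in Backup re-roots the aside path onto the destination drive by replacing only its first character. A worked example with made-up paths:

char destinationDriveLetter = 'D';   // derived from a sample args[4] such as "D:\\Backup"
string asidePath = @"C:\Work\helper\helper-2024-12-17-12-00-00-000.iso";   // sample aside path
string path = $"{destinationDriveLetter}{asidePath[1..]}";
// path == @"D:\Work\helper\helper-2024-12-17-12-00-00-000.iso"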

View File

@ -0,0 +1,200 @@
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
#if ShellProgressBar
using ShellProgressBar;
#endif
using System.Collections.ObjectModel;
using System.Diagnostics;
using System.Text.Json;
namespace File_Folder_Helper.ADO2024.PI4;
internal static partial class Helper20241224
{
private static readonly HttpClient _HttpClient = new();
private record Record(Uri URI, string Path, DateTime LastModified);
private static ReadOnlyCollection<NginxFileSystem>? GetRecursiveCollection(string host, string page)
{
List<NginxFileSystem>? results;
Uri uri = new($"https://{host}/{page}");
string format = NginxFileSystem.GetFormat();
TimeZoneInfo timeZoneInfo = TimeZoneInfo.Local;
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(uri);
taskHttpResponseMessage.Wait();
if (!taskHttpResponseMessage.Result.IsSuccessStatusCode)
results = null;
else
{
Task<string> taskString = taskHttpResponseMessage.Result.Content.ReadAsStringAsync();
taskString.Wait();
NginxFileSystem[]? nginxFileSystems = JsonSerializer.Deserialize(taskString.Result, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (nginxFileSystems is null)
results = null;
else
{
results = [];
NginxFileSystem nginxFileSystem;
ReadOnlyCollection<NginxFileSystem>? directory;
for (int i = 0; i < nginxFileSystems.Length; i++)
{
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
if (nginxFileSystem.Type == "file")
results.Add(nginxFileSystem);
else
{
directory = GetRecursiveCollection(host, $"{page}/{nginxFileSystem.Name}");
if (directory is null)
continue;
results.AddRange(directory);
}
}
}
}
return results?.AsReadOnly();
}
private static ReadOnlyCollection<NginxFileSystem>? GetCollection(string format, TimeZoneInfo timeZoneInfo, Uri uri)
{
List<NginxFileSystem>? results;
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(uri);
taskHttpResponseMessage.Wait();
if (!taskHttpResponseMessage.Result.IsSuccessStatusCode)
results = null;
else
{
Task<string> taskString = taskHttpResponseMessage.Result.Content.ReadAsStringAsync();
taskString.Wait();
if (taskString.Result.StartsWith('<'))
results = null;
else
{
NginxFileSystem[]? nginxFileSystems = JsonSerializer.Deserialize(taskString.Result, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (nginxFileSystems is null)
results = null;
else
{
results = [];
NginxFileSystem nginxFileSystem;
for (int i = 0; i < nginxFileSystems.Length; i++)
{
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
results.Add(nginxFileSystem);
}
}
}
}
return results?.AsReadOnly();
}
private static Record? CompareFile(string host, ReadOnlyCollection<string> directoryNames, string compareDirectory, NginxFileSystem nginxFileSystem)
{
Record? result;
if (nginxFileSystem.LastModified is null || nginxFileSystem.Length is null)
result = null;
else
{
Uri uri = new($"https://{host}/{string.Join('/', directoryNames)}/{nginxFileSystem.Name}");
FileInfo fileInfo = new($"{compareDirectory}\\{string.Join('\\', directoryNames)}\\{nginxFileSystem.Name}");
if (!fileInfo.Exists || fileInfo.Length != nginxFileSystem.Length.Value)
result = new(uri, fileInfo.FullName, nginxFileSystem.LastModified.Value);
else
{
double totalSeconds = new TimeSpan(fileInfo.LastWriteTime.Ticks - nginxFileSystem.LastModified.Value.Ticks).TotalSeconds;
if (totalSeconds is < 2 and > -2)
result = null;
else
result = new(uri, fileInfo.FullName, nginxFileSystem.LastModified.Value);
}
}
return result;
}
private static ReadOnlyCollection<Record> CompareDirectory(string format, TimeZoneInfo timeZoneInfo, string host, ReadOnlyCollection<string> directoryNames, string compareDirectory, NginxFileSystem nginxFileSystem)
{
ReadOnlyCollection<Record> results;
List<string> collection = directoryNames.ToList();
collection.Add(nginxFileSystem.Name);
results = GetRecord(format, timeZoneInfo, host, collection.AsReadOnly(), compareDirectory);
return results;
}
private static ReadOnlyCollection<Record> GetRecord(string format, TimeZoneInfo timeZoneInfo, string host, ReadOnlyCollection<string> directoryNames, string compareDirectory)
{
List<Record> results = [];
Uri uri = new($"https://{host}/{string.Join('/', directoryNames)}");
ReadOnlyCollection<NginxFileSystem>? nginxFileSystems = GetCollection(format, timeZoneInfo, uri);
if (nginxFileSystems is not null)
{
NginxFileSystem nginxFileSystem;
ReadOnlyCollection<Record> records;
string checkDirectory = $"{compareDirectory}\\{string.Join('\\', directoryNames)}";
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
for (int i = 0; i < nginxFileSystems.Count; i++)
{
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
if (nginxFileSystem.Type == "file")
{
Record? record = CompareFile(host, directoryNames, compareDirectory, nginxFileSystem);
if (record is not null)
results.Add(record);
}
else
{
records = CompareDirectory(format, timeZoneInfo, host, directoryNames, compareDirectory, nginxFileSystem);
foreach (Record record in records)
results.Add(record);
}
}
}
return results.AsReadOnly();
}
private static void Download(Record record)
{
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(record.URI);
taskHttpResponseMessage.Wait();
if (taskHttpResponseMessage.Result.IsSuccessStatusCode)
{
Task<string> taskString = taskHttpResponseMessage.Result.Content.ReadAsStringAsync();
taskString.Wait();
File.WriteAllText(record.Path, taskString.Result);
File.SetLastWriteTime(record.Path, record.LastModified);
}
}
internal static void Compare(ILogger<Worker> logger, List<string> args)
{
string host = args[2];
string rootDirectoryName = args[3];
string format = NginxFileSystem.GetFormat();
TimeZoneInfo timeZoneInfo = TimeZoneInfo.Local;
string compareDirectory = Path.GetFullPath(args[0]);
logger.LogInformation("Comparing files on {host}", host);
ReadOnlyCollection<Record> records = GetRecord(format, timeZoneInfo, host, new([rootDirectoryName]), compareDirectory);
#if ShellProgressBar
ProgressBar progressBar = new(records.Count, "Downloading", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
foreach (Record record in records)
{
#if ShellProgressBar
progressBar.Tick();
#endif
Download(record);
}
#if ShellProgressBar
progressBar.Dispose();
#endif
if (Debugger.IsAttached)
{
ReadOnlyCollection<NginxFileSystem>? recursiveCollection = GetRecursiveCollection(host, rootDirectoryName);
string? json = recursiveCollection is null ? null : JsonSerializer.Serialize(recursiveCollection.ToArray(), NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (!string.IsNullOrEmpty(json))
File.WriteAllText(Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", ".json"), json);
}
}
}
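CompareFile treats a local file as current when its length matches and its last-write time is within two seconds of the server's mtime, presumably to absorb coarse file-system timestamp resolution. The check, condensed into a standalone sketch (names are mine):

static bool IsCurrent(FileInfo local, long remoteLength, DateTime remoteLastModified)
{
    if (!local.Exists || local.Length != remoteLength)
        return false;   // a size mismatch or missing file always forces a download
    double totalSeconds = (local.LastWriteTime - remoteLastModified).TotalSeconds;
    return totalSeconds is < 2 and > -2;   // within the two-second tolerance: keep the local copy
}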

View File

@ -0,0 +1,61 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.ADO2025.PI4;
internal static partial class Helper20250101
{
private static ReadOnlyDictionary<string, List<FileInfo>> GetKeyValuePairs(string directory, string searchPattern, string split)
{
string key;
List<FileInfo>? collection;
Dictionary<string, List<FileInfo>> results = [];
string[] files = Directory.GetFiles(directory, searchPattern, SearchOption.TopDirectoryOnly);
FileInfo[] fileInfoCollection = files.Select(l => new FileInfo(l)).ToArray();
foreach (FileInfo fileInfo in fileInfoCollection.OrderBy(l => l.LastWriteTime))
{
key = fileInfo.Name.Split(split)[0];
if (!results.TryGetValue(key, out collection))
{
results.Add(key, []);
if (!results.TryGetValue(key, out collection))
throw new Exception();
}
collection.Add(fileInfo);
}
return results.AsReadOnly();
}
private static void MoveToDelete(ILogger<Worker> logger, string appendage, ReadOnlyDictionary<string, List<FileInfo>> keyValuePairs)
{
string checkFile;
FileInfo fileInfo;
foreach (KeyValuePair<string, List<FileInfo>> keyValuePair in keyValuePairs)
{
if (keyValuePair.Value.Count < 3)
continue;
for (int i = 1; i < keyValuePair.Value.Count - 1; i++)
{
fileInfo = keyValuePair.Value[i];
checkFile = Path.Combine($"{fileInfo.Directory}{appendage}", fileInfo.Name);
if (File.Exists(checkFile))
continue;
logger.LogInformation("Moving <{fileInfo.FullName}> to <{checkFile}>", fileInfo.FullName, checkFile);
File.Move(fileInfo.FullName, checkFile);
}
}
}
internal static void MoveToDelete(ILogger<Worker> logger, List<string> args)
{
string split = args[3];
string appendage = args[4];
string searchPattern = args[2];
string compareDirectory = Path.GetFullPath(args[0]);
ReadOnlyDictionary<string, List<FileInfo>> keyValuePairs = GetKeyValuePairs(compareDirectory, searchPattern, split);
logger.LogInformation("KeyValuePairs: {keyValuePairs}", keyValuePairs.Count);
MoveToDelete(logger, appendage, keyValuePairs);
}
}
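Because each per-key list is ordered oldest to newest and the inner loop runs from index 1 to Count - 2, MoveToDelete keeps the first and last file of every group and only moves the middle revisions into the appendage directory (args[4]). A small illustration with a made-up group:

// Sample group, already ordered oldest -> newest as GetKeyValuePairs returns it:
string[] group = ["log_a.txt", "log_b.txt", "log_c.txt", "log_d.txt"];
for (int i = 1; i < group.Length - 1; i++)
    Console.WriteLine($"would move {group[i]}");   // prints log_b.txt and log_c.txt only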

View File

@ -0,0 +1,54 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI4;
internal static partial class Helper20250114
{
private static void Rename(string[] directories, string dateFormat)
{
string[] files;
DateTime dateTime;
FileInfo fileInfo;
string checkDirectory;
foreach (string directory in directories)
{
dateTime = DateTime.MinValue;
files = Directory.GetFiles(directory, "*", SearchOption.AllDirectories);
foreach (string file in files)
{
fileInfo = new(file);
if (dateTime > fileInfo.LastWriteTime)
continue;
dateTime = fileInfo.LastWriteTime;
}
if (dateTime == DateTime.MinValue)
continue;
checkDirectory = Path.Combine(Path.GetDirectoryName(directory) ?? throw new Exception(), dateTime.ToString(dateFormat));
if (checkDirectory != directory)
{
if (Directory.Exists(checkDirectory))
continue;
Directory.Move(directory, checkDirectory);
}
Directory.SetLastWriteTime(checkDirectory, dateTime);
}
}
private static void Rename(ILogger<Worker> logger, string sourceDirectory, string searchPattern, string dateFormat)
{
string[] directories = Directory.GetDirectories(sourceDirectory, searchPattern);
logger.LogInformation("directories: {directories}", directories.Length);
Rename(directories, dateFormat);
}
internal static void Rename(ILogger<Worker> logger, List<string> args)
{
string dateFormat = args[3];
string[] searchPatterns = args[2].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
foreach (string searchPattern in searchPatterns)
Rename(logger, sourceDirectory, searchPattern, dateFormat);
}
}
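Each matched directory is renamed to the newest contained file's last-write time, formatted with args[3], and then stamped with that time. A worked example with sample values:

DateTime newest = new(2025, 1, 14, 13, 0, 52);   // sample newest LastWriteTime in the directory
string dateFormat = "yyyy-MM-dd";                // sample args[3]
string parent = @"C:\Data";                      // sample parent of the matched directory
string checkDirectory = Path.Combine(parent, newest.ToString(dateFormat));
// checkDirectory == @"C:\Data\2025-01-14"; the directory is moved there unless it already exists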

View File

@ -0,0 +1,93 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Globalization;
namespace File_Folder_Helper.ADO2025.PI4;
internal static partial class Helper20250126
{
private static void Move(string file, string fileName, string checkFile, List<string> foundLines, ReadOnlyCollection<DateTime> dateTimes)
{
string checkDirectory = Path.Combine(Path.GetDirectoryName(file) ?? throw new Exception(), dateTimes[0].ToString("yyyy-MM"));
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
string fileNameB = Path.GetFileName(checkFile);
string checkFileB = Path.Combine(checkDirectory, fileName);
string checkFileC = Path.Combine(checkDirectory, fileNameB);
string contents = string.Join(Environment.NewLine, foundLines);
string checkFileD = Path.Combine(checkDirectory, $"{fileName}.txt");
if (!File.Exists(checkFileB))
File.Move(file, checkFileB);
if (!File.Exists(checkFileC))
File.Move(checkFile, checkFileC);
File.WriteAllText(checkFileD, contents);
}
private static void Move(ILogger<Worker> logger, string dateFormat, string file, string checkFile, string fileName, ReadOnlyCollection<string> statementPeriodSegments, List<string> foundLines)
{
DateTime dateTime;
List<DateTime> dateTimes = [];
foreach (string check in statementPeriodSegments)
{
if (!DateTime.TryParseExact(check, dateFormat, CultureInfo.InvariantCulture, DateTimeStyles.None, out dateTime))
continue;
dateTimes.Add(dateTime);
}
if (dateTimes.Count != 2)
logger.LogInformation($"Only {dateTimes.Count} date(s) were found in <{fileName}>!");
else
Move(file, fileName, checkFile, foundLines, dateTimes.AsReadOnly());
}
private static void Move(ILogger<Worker> logger, string file, string checkFile, string dateFormat, string statementPeriod, string search)
{
List<string> foundLines = [];
bool statementPeriodFound = false;
string[]? statementPeriodSegments = null;
string fileName = Path.GetFileName(file);
string[] lines = File.ReadAllLines(file);
foreach (string line in lines)
{
if (statementPeriodSegments is not null)
{
if (line.Contains(search))
foundLines.Add(line);
}
else
{
if (statementPeriodFound)
{
statementPeriodSegments = line.Split(' ');
continue;
}
if (!line.Contains(statementPeriod))
continue;
statementPeriodFound = true;
}
}
if (statementPeriodSegments is null || statementPeriodSegments.Length < 4)
logger.LogInformation($"{nameof(statementPeriod)}: {statementPeriod}; wasn't found in <{fileName}>!");
else
Move(logger, dateFormat, file, checkFile, fileName, statementPeriodSegments.AsReadOnly(), foundLines);
}
internal static void Move(ILogger<Worker> logger, List<string> args)
{
string checkFile;
string search = args[5];
string dateFormat = args[3];
string searchPatterns = args[2];
string statementPeriod = args[4];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPatterns, SearchOption.AllDirectories);
foreach (string file in files)
{
checkFile = Path.ChangeExtension(file, ".pdf");
if (!File.Exists(checkFile))
continue;
Move(logger, file, checkFile, dateFormat, statementPeriod, search);
}
}
}
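The statement-period parsing expects the line after the marker to split into at least four space-separated segments, exactly two of which parse as dates in the args[3] format; the first date decides the yyyy-MM folder. A worked sample with illustrative marker text and format:

using System.Globalization;
string dateFormat = "MM/dd/yyyy";   // sample args[3]
string[] statementPeriodSegments = "Period 01/01/2025 through 01/31/2025".Split(' ');   // sample line
List<DateTime> dateTimes = [];
foreach (string check in statementPeriodSegments)
    if (DateTime.TryParseExact(check, dateFormat, CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime dateTime))
        dateTimes.Add(dateTime);
// dateTimes.Count == 2, so the statement and its PDF would land in a "2025-01" subdirectory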

View File

@ -0,0 +1,311 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Globalization;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.RegularExpressions;
namespace File_Folder_Helper.ADO2025.PI4;
internal static partial class Helper20250204
{
[GeneratedRegex("([A-Z]+(.))")]
private static partial Regex UpperCase();
[GeneratedRegex("[\\s!?.,@:;|\\\\/\"'`£$%\\^&*{}[\\]()<>~#+\\-=_¬]+")]
private static partial Regex InvalidCharacter();
private record H1ParamCaseAndState(string H1, string ParamCase, string State)
{
private static string GetParamCase(string value)
{
string result;
StringBuilder stringBuilder = new(value);
Match[] matches = UpperCase().Matches(value).ToArray();
for (int i = matches.Length - 1; i > -1; i--)
_ = stringBuilder.Insert(matches[i].Index, '-');
string[] segments = InvalidCharacter().Split(stringBuilder.ToString().ToLower());
result = string.Join('-', segments).Trim('-');
return result;
}
private static string GetState(string value) =>
value switch
{
"New" => "ToDo",
"Active" => "In Progress",
"Closed" => "Done",
_ => "Backlog",
};
internal static H1ParamCaseAndState Get(WorkItem workItem)
{
H1ParamCaseAndState result;
string paramCase = GetParamCase(workItem.Title);
string state = GetState(workItem.State);
result = new(workItem.Title, paramCase, state);
return result;
}
}
private record Attribute([property: JsonPropertyName("isLocked")] bool IsLocked,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("parameterTitle")] string? ParameterTitle,
[property: JsonPropertyName("state")] string? State,
[property: JsonPropertyName("workItemType")] string? WorkItemType);
private record Relation([property: JsonPropertyName("attributes")] Attribute Attributes,
[property: JsonPropertyName("id")] int Id,
[property: JsonPropertyName("rel")] string Rel);
private record WorkItem(DateTime? ActivatedDate,
string AreaPath,
string? AssignedTo,
long? BusinessValue,
DateTime ChangedDate,
DateTime? ClosedDate,
int CommentCount,
DateTime CreatedDate,
string Description,
long? Effort,
int Id,
string IterationPath,
int? Parent,
int? Priority,
Relation[]? Relations,
long? RemainingWork,
string? Requester,
DateTime? ResolvedDate,
int Revision,
long? RiskReductionMinusOpportunityEnablement,
DateTime? StartDate,
string State,
long? StoryPoints,
string Tags,
DateTime? TargetDate,
long? TimeCriticality,
string Title,
string? Violation,
long? WeightedShortestJobFirst,
string WorkItemType)
{
public override string ToString() => $"{Id} - {WorkItemType} - {Title}";
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(WorkItem))]
private partial class WorkItemSourceGenerationContext : JsonSerializerContext
{
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(WorkItem[]))]
private partial class WorkItemCollectionSourceGenerationContext : JsonSerializerContext
{
}
private static string[] GetTaskLines(string directory, string rootDirectory) =>
[
"{",
"\"version\": \"2.0.0\",",
"\"tasks\": [",
"{",
"\"label\": \"File-Folder-Helper AOT s X Day-Helper-2025-02-04\",",
"\"type\": \"shell\",",
"\"command\": \"L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe\",",
"\"args\": [",
"\"s\",",
"\"X\",",
$"\"{directory}\",",
"\"Day-Helper-2025-02-04\",",
$"\"{rootDirectory}\",",
"],",
"\"problemMatcher\": []",
"},",
"{",
"\"label\": \"File-Folder-Helper AOT s X Day-Helper-2024-06-23\",",
"\"type\": \"shell\",",
"\"command\": \"L:/DevOps/Mesa_FI/File-Folder-Helper/bin/Release/net8.0/win-x64/publish/File-Folder-Helper.exe\",",
"\"args\": [",
"\"s\",",
"\"X\",",
$"\"{directory}\",",
"\"Day-Helper-2024-06-23\",",
"\"*.md\",",
"\"##_Sub-tasks\",",
"\"-_[code-insiders](\",",
"\"index.md\",",
"\"-_[,](\",",
"\"##_Done\",",
"\".kan\",",
$"\"{rootDirectory}\",",
"\"316940400000\",",
"],",
"\"problemMatcher\": []",
"}",
"]",
"}",
];
private static string GetTaskText(string directory, string rootDirectory) =>
string.Join(Environment.NewLine, GetTaskLines(directory, rootDirectory));
private static void WriteTaskFile(string sourceDirectory, string rootDirectory)
{
string tasksFile = Path.Combine(sourceDirectory, ".vscode", "tasks.json");
string oldText = File.ReadAllText(tasksFile);
string jsonSafeDirectory = sourceDirectory.Replace('\\', '/');
if (!oldText.Contains(jsonSafeDirectory))
{
string text = GetTaskText(jsonSafeDirectory, rootDirectory);
File.WriteAllText(tasksFile, text);
}
}
private static string GetFilter(ReadOnlyCollection<H1ParamCaseAndState> collection, string filter) =>
string.Join(Environment.NewLine, from l in collection where l.State == filter select $"- [{l.ParamCase}](tasks/{l.ParamCase}.md)");
private static string[] GetIndexLines(WorkItem workItem, H1ParamCaseAndState h1ParamCaseAndState, ReadOnlyCollection<H1ParamCaseAndState> collection) =>
[
"---",
"startedColumns:",
" - 'In Progress'",
"completedColumns:",
" - Done",
"---",
string.Empty,
$"# {workItem.Id} - {h1ParamCaseAndState.H1}",
string.Empty,
"## Backlog",
string.Empty,
GetFilter(collection, "Backlog"),
string.Empty,
"## Todo",
string.Empty,
GetFilter(collection, "ToDo"),
string.Empty,
"## In Progress",
string.Empty,
GetFilter(collection, "In Progress"),
string.Empty,
"## Done",
string.Empty,
GetFilter(collection, "Done"),
string.Empty
];
private static string GetIndexText(WorkItem workItem, H1ParamCaseAndState h1ParamCaseAndState, ReadOnlyCollection<H1ParamCaseAndState> collection) =>
string.Join(Environment.NewLine, GetIndexLines(workItem, h1ParamCaseAndState, collection));
private static string GetIndexMarkdown(FileInfo fileInfo, ReadOnlyCollection<WorkItem> workItems)
{
string result;
H1ParamCaseAndState h1ParamCaseAndState;
List<H1ParamCaseAndState> collection = [];
foreach (WorkItem w in workItems)
{
h1ParamCaseAndState = H1ParamCaseAndState.Get(w);
collection.Add(h1ParamCaseAndState);
}
string line = Environment.NewLine;
string json = File.ReadAllText(fileInfo.FullName);
WorkItem? workItem = JsonSerializer.Deserialize(json, WorkItemSourceGenerationContext.Default.WorkItem) ??
throw new NullReferenceException(nameof(WorkItem));
h1ParamCaseAndState = H1ParamCaseAndState.Get(workItem);
string text = GetIndexText(workItem, h1ParamCaseAndState, collection.AsReadOnly());
result = text.Replace($"{line}{line}{line}{line}", $"{line}{line}").Replace("408m](tasks", "408M](tasks");
return result;
}
private static ReadOnlyCollection<WorkItem> GetWorkItems(string[] files)
{
List<WorkItem> results = [];
string json;
WorkItem? workItem;
foreach (string file in files)
{
json = File.ReadAllText(file);
workItem = JsonSerializer.Deserialize(json, WorkItemSourceGenerationContext.Default.WorkItem);
if (workItem is null)
continue;
results.Add(workItem);
}
return results.AsReadOnly();
}
private static void ExtractKanban(string searchPattern, string rootDirectory, DirectoryInfo kanbanDirectory, FileInfo fileInfo)
{
string checkFile;
string weekOfYear;
string workItemDirectory;
string line = Environment.NewLine;
H1ParamCaseAndState h1ParamCaseAndState;
Calendar calendar = new CultureInfo("en-US").Calendar;
string tasksDirectory = Path.Combine(kanbanDirectory.FullName, "tasks");
if (!Directory.Exists(tasksDirectory))
_ = Directory.CreateDirectory(tasksDirectory);
string[] files = Directory.GetFiles(tasksDirectory, searchPattern, SearchOption.TopDirectoryOnly);
ReadOnlyCollection<WorkItem> workItems = GetWorkItems(files);
string markdown = GetIndexMarkdown(fileInfo, workItems);
string indexFile = Path.Combine(kanbanDirectory.FullName, "index.md");
string markdownOld = File.Exists(indexFile) ? File.ReadAllText(indexFile) : string.Empty;
if (markdown != markdownOld)
File.WriteAllText(indexFile, markdown);
foreach (WorkItem workItem in workItems)
{
h1ParamCaseAndState = H1ParamCaseAndState.Get(workItem);
checkFile = Path.Combine(tasksDirectory, $"{h1ParamCaseAndState.ParamCase}.md");
markdownOld = File.Exists(checkFile) ? File.ReadAllText(checkFile) : string.Empty;
if (markdownOld.Contains("]("))
continue;
weekOfYear = calendar.GetWeekOfYear(workItem.CreatedDate, CalendarWeekRule.FirstDay, DayOfWeek.Sunday).ToString("00");
workItemDirectory = Path.GetFullPath(Path.Combine(rootDirectory, $"{workItem.CreatedDate:yyyy}", $"{workItem.CreatedDate:yyyy}_Week_{weekOfYear}", $"{workItem.Id}"));
markdown = $"# {h1ParamCaseAndState.H1}{line}{line}## Id {workItem.Id}{line}{line}## Code Insiders{line}{line}- [code-insiders]({workItemDirectory}){line}";
if (markdown != markdownOld)
File.WriteAllText(checkFile, markdown);
}
}
private static string GetSourceDirectory(string directory)
{
string? result = null;
DirectoryInfo directoryInfo;
string? checkDirectory = directory;
string? pathRoot = Path.GetPathRoot(directory);
for (int i = 0; i < int.MaxValue; i++)
{
checkDirectory = Path.GetDirectoryName(checkDirectory);
if (string.IsNullOrEmpty(checkDirectory) || checkDirectory == pathRoot)
break;
directoryInfo = new(checkDirectory);
if (string.IsNullOrEmpty(directoryInfo.LinkTarget))
continue;
result = directory.Replace(checkDirectory, directoryInfo.LinkTarget);
break;
}
result ??= directory;
return result;
}
internal static void ExtractKanban(ILogger<Worker> logger, List<string> args)
{
string searchPattern = "*.json";
string fullPath = Path.GetFullPath(args[0]);
string sourceDirectory = GetSourceDirectory(fullPath);
string rootDirectory = args.Count < 3 || args[2].Length < 16 ? "D:/5-Other-Small/Kanban-mestsa003/{}" : args[2];
WriteTaskFile(sourceDirectory, rootDirectory);
string sourceDirectoryName = Path.GetFileName(sourceDirectory);
DirectoryInfo directoryInfo = new(Path.Combine(sourceDirectory, ".kanbn"));
FileInfo? fileInfo = !directoryInfo.Exists ? null : new(Path.Combine(directoryInfo.FullName, $"{sourceDirectoryName}.json"));
if (directoryInfo.Exists && fileInfo is not null && fileInfo.Exists)
ExtractKanban(searchPattern, rootDirectory, directoryInfo, fileInfo);
else
logger.LogWarning("<{directoryInfo}> doesn't exist", directoryInfo.FullName);
}
}

ADO2025/PI5/.editorconfig

@@ -0,0 +1,381 @@
[*.md]
end_of_line = crlf
file_header_template = unset
indent_size = 2
indent_style = space
insert_final_newline = false
root = true
tab_width = 2
[*.csproj]
end_of_line = crlf
file_header_template = unset
indent_size = 2
indent_style = space
insert_final_newline = false
root = true
tab_width = 2
[*.cs]
csharp_indent_block_contents = true
csharp_indent_braces = false
csharp_indent_case_contents = true
csharp_indent_case_contents_when_block = true
csharp_indent_labels = one_less_than_current
csharp_indent_switch_labels = true
csharp_new_line_before_catch = false
csharp_new_line_before_else = false
csharp_new_line_before_finally = false
csharp_new_line_before_members_in_anonymous_types = true
csharp_new_line_before_members_in_object_initializers = true
csharp_new_line_before_open_brace = none
csharp_new_line_between_query_expression_clauses = true
csharp_prefer_braces = false
csharp_prefer_qualified_reference = true:error
csharp_prefer_simple_default_expression = true:warning
csharp_prefer_simple_using_statement = true:warning
csharp_prefer_static_local_function = true:warning
csharp_preferred_modifier_order = public,private,protected,internal,static,extern,new,virtual,abstract,sealed,override,readonly,unsafe,volatile,async
csharp_preserve_single_line_blocks = true
csharp_preserve_single_line_statements = false
csharp_space_after_cast = false
csharp_space_after_colon_in_inheritance_clause = true
csharp_space_after_comma = true
csharp_space_after_dot = false
csharp_space_after_keywords_in_control_flow_statements = true
csharp_space_after_semicolon_in_for_statement = true
csharp_space_around_binary_operators = before_and_after
csharp_space_around_declaration_statements = false
csharp_space_before_colon_in_inheritance_clause = true
csharp_space_before_comma = false
csharp_space_before_dot = false
csharp_space_before_open_square_brackets = false
csharp_space_before_semicolon_in_for_statement = false
csharp_space_between_empty_square_brackets = false
csharp_space_between_method_call_empty_parameter_list_parentheses = false
csharp_space_between_method_call_name_and_opening_parenthesis = false
csharp_space_between_method_call_parameter_list_parentheses = false
csharp_space_between_method_declaration_empty_parameter_list_parentheses = false
csharp_space_between_method_declaration_name_and_open_parenthesis = false
csharp_space_between_method_declaration_parameter_list_parentheses = false
csharp_space_between_parentheses = false
csharp_space_between_square_brackets = false
csharp_style_allow_blank_line_after_colon_in_constructor_initializer_experimental = true
csharp_style_allow_blank_line_after_token_in_arrow_expression_clause_experimental = true
csharp_style_allow_blank_line_after_token_in_conditional_expression_experimental = true
csharp_style_allow_blank_lines_between_consecutive_braces_experimental = true
csharp_style_allow_embedded_statements_on_same_line_experimental = true
csharp_style_conditional_delegate_call = true
csharp_style_deconstructed_variable_declaration = false
csharp_style_expression_bodied_accessors = when_on_single_line:warning
csharp_style_expression_bodied_constructors = when_on_single_line:warning
csharp_style_expression_bodied_indexers = when_on_single_line:warning
csharp_style_expression_bodied_lambdas = when_on_single_line:warning
csharp_style_expression_bodied_local_functions = when_on_single_line:warning
csharp_style_expression_bodied_methods = when_on_single_line:warning
csharp_style_expression_bodied_operators = when_on_single_line:warning
csharp_style_expression_bodied_properties = when_on_single_line:warning
csharp_style_implicit_object_creation_when_type_is_apparent = true:warning
csharp_style_inlined_variable_declaration = false
csharp_style_namespace_declarations = file_scoped:warning
csharp_style_pattern_local_over_anonymous_function = true:warning
csharp_style_pattern_matching_over_as_with_null_check = true:warning
csharp_style_pattern_matching_over_is_with_cast_check = true:warning
csharp_style_prefer_index_operator = true:warning
csharp_style_prefer_not_pattern = true:warning
csharp_style_prefer_null_check_over_type_check = true
csharp_style_prefer_pattern_matching = true:warning
csharp_style_prefer_range_operator = true:warning
csharp_style_prefer_switch_expression = true:warning
csharp_style_throw_expression = true
csharp_style_unused_value_assignment_preference = discard_variable:warning
csharp_style_unused_value_expression_statement_preference = discard_variable:warning
csharp_style_var_elsewhere = false:warning
csharp_style_var_for_built_in_types = false:warning
csharp_style_var_when_type_is_apparent = false:warning
csharp_using_directive_placement = outside_namespace
dotnet_analyzer_diagnostic.category-Design.severity = error
dotnet_analyzer_diagnostic.category-Documentation.severity = error
dotnet_analyzer_diagnostic.category-Globalization.severity = none
dotnet_analyzer_diagnostic.category-Interoperability.severity = error
dotnet_analyzer_diagnostic.category-Maintainability.severity = error
dotnet_analyzer_diagnostic.category-Naming.severity = none
dotnet_analyzer_diagnostic.category-Performance.severity = none
dotnet_analyzer_diagnostic.category-Reliability.severity = error
dotnet_analyzer_diagnostic.category-Security.severity = error
dotnet_analyzer_diagnostic.category-SingleFile.severity = error
dotnet_analyzer_diagnostic.category-Style.severity = error
dotnet_analyzer_diagnostic.category-Usage.severity = error
dotnet_code_quality_unused_parameters = non_public
dotnet_code_quality.CAXXXX.api_surface = private, internal
dotnet_diagnostic.CA1001.severity = error # CA1001: Types that own disposable fields should be disposable
dotnet_diagnostic.CA1051.severity = error # CA1051: Do not declare visible instance fields
dotnet_diagnostic.CA1511.severity = warning # CA1511: Use 'ArgumentException.ThrowIfNullOrEmpty' instead of explicitly throwing a new exception instance
dotnet_diagnostic.CA1513.severity = warning # Use 'ObjectDisposedException.ThrowIf' instead of explicitly throwing a new exception instance
dotnet_diagnostic.CA1825.severity = warning # CA1825: Avoid zero-length array allocations
dotnet_diagnostic.CA1829.severity = error # CA1829: Use Length/Count property instead of Count() when available
dotnet_diagnostic.CA1834.severity = warning # CA1834: Consider using 'StringBuilder.Append(char)' when applicable
dotnet_diagnostic.CA1860.severity = error # CA1860: Prefer comparing 'Count' to 0 rather than using 'Any()', both for clarity and for performance
dotnet_diagnostic.CA1862.severity = warning # CA1862: Prefer using 'string.Equals(string, StringComparison)' to perform a case-insensitive comparison, but keep in mind that this might cause subtle changes in behavior, so make sure to conduct thorough testing after applying the suggestion, or if culturally sensitive comparison is not required, consider using 'StringComparison.OrdinalIgnoreCase'
dotnet_diagnostic.CA1869.severity = none # CA1869: Avoid creating a new 'JsonSerializerOptions' instance for every serialization operation. Cache and reuse instances instead.
dotnet_diagnostic.CA2201.severity = none # CA2201: Exception type System.NullReferenceException is reserved by the runtime
dotnet_diagnostic.CA2254.severity = none # CA2254: The logging message template should not vary between calls to 'LoggerExtensions.LogInformation(ILogger, string?, params object?[])'
dotnet_diagnostic.IDE0001.severity = warning # IDE0001: Simplify name
dotnet_diagnostic.IDE0002.severity = warning # Simplify (member access) - System.Version.Equals("1", "2"); Version.Equals("1", "2");
dotnet_diagnostic.IDE0004.severity = warning # IDE0004: Cast is redundant.
dotnet_diagnostic.IDE0005.severity = error # Using directive is unnecessary
dotnet_diagnostic.IDE0010.severity = none # Add missing cases to switch statement (IDE0010)
dotnet_diagnostic.IDE0028.severity = error # IDE0028: Collection initialization can be simplified
dotnet_diagnostic.IDE0031.severity = warning # Use null propagation (IDE0031)
dotnet_diagnostic.IDE0047.severity = warning # IDE0047: Parentheses can be removed
dotnet_diagnostic.IDE0048.severity = none # Parentheses preferences (IDE0047 and IDE0048)
dotnet_diagnostic.IDE0049.severity = warning # Use language keywords instead of framework type names for type references (IDE0049)
dotnet_diagnostic.IDE0051.severity = error # Private member '' is unused [, ]
dotnet_diagnostic.IDE0058.severity = error # IDE0058: Expression value is never used
dotnet_diagnostic.IDE0060.severity = error # IDE0060: Remove unused parameter
dotnet_diagnostic.IDE0074.severity = warning # IDE0074: Use compound assignment
dotnet_diagnostic.IDE0130.severity = none # Namespace does not match folder structure (IDE0130)
dotnet_diagnostic.IDE0270.severity = warning # IDE0270: Null check can be simplified
dotnet_diagnostic.IDE0290.severity = none # Use primary constructor [Distance]csharp(IDE0290)
dotnet_diagnostic.IDE0300.severity = error # IDE0300: Collection initialization can be simplified
dotnet_diagnostic.IDE0301.severity = error # IDE0301: Collection initialization can be simplified
dotnet_diagnostic.IDE0305.severity = none # IDE0305: Collection initialization can be simplified
dotnet_diagnostic.IDE2000.severity = error # IDE2000: Avoid multiple blank lines
dotnet_naming_rule.abstract_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.abstract_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.abstract_method_should_be_pascal_case.symbols = abstract_method
dotnet_naming_rule.class_should_be_pascal_case.severity = warning
dotnet_naming_rule.class_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.class_should_be_pascal_case.symbols = class
dotnet_naming_rule.delegate_should_be_pascal_case.severity = warning
dotnet_naming_rule.delegate_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.delegate_should_be_pascal_case.symbols = delegate
dotnet_naming_rule.enum_should_be_pascal_case.severity = warning
dotnet_naming_rule.enum_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.enum_should_be_pascal_case.symbols = enum
dotnet_naming_rule.event_should_be_pascal_case.severity = warning
dotnet_naming_rule.event_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.event_should_be_pascal_case.symbols = event
dotnet_naming_rule.interface_should_be_begins_with_i.severity = warning
dotnet_naming_rule.interface_should_be_begins_with_i.style = begins_with_i
dotnet_naming_rule.interface_should_be_begins_with_i.symbols = interface
dotnet_naming_rule.method_should_be_pascal_case.severity = warning
dotnet_naming_rule.method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.method_should_be_pascal_case.symbols = method
dotnet_naming_rule.non_field_members_should_be_pascal_case.severity = warning
dotnet_naming_rule.non_field_members_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.non_field_members_should_be_pascal_case.symbols = non_field_members
dotnet_naming_rule.private_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.private_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.private_method_should_be_pascal_case.symbols = private_method
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.symbols = private_or_internal_field
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.symbols = private_or_internal_static_field
dotnet_naming_rule.property_should_be_pascal_case.severity = warning
dotnet_naming_rule.property_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.property_should_be_pascal_case.symbols = property
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.symbols = public_or_protected_field
dotnet_naming_rule.static_field_should_be_pascal_case.severity = warning
dotnet_naming_rule.static_field_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.static_field_should_be_pascal_case.symbols = static_field
dotnet_naming_rule.static_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.static_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.static_method_should_be_pascal_case.symbols = static_method
dotnet_naming_rule.struct_should_be_pascal_case.severity = warning
dotnet_naming_rule.struct_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.struct_should_be_pascal_case.symbols = struct
dotnet_naming_rule.types_should_be_pascal_case.severity = warning
dotnet_naming_rule.types_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.types_should_be_pascal_case.symbols = types
dotnet_naming_style.begins_with_i.capitalization = pascal_case
dotnet_naming_style.begins_with_i.required_prefix = I
dotnet_naming_style.begins_with_i.required_suffix =
dotnet_naming_style.begins_with_i.word_separator =
dotnet_naming_style.pascal_case.capitalization = pascal_case
dotnet_naming_style.pascal_case.required_prefix =
dotnet_naming_style.pascal_case.required_suffix =
dotnet_naming_style.pascal_case.word_separator =
dotnet_naming_style.private_of_internal_field.capitalization = pascal_case
dotnet_naming_style.private_of_internal_field.required_prefix = _
dotnet_naming_style.private_of_internal_field.required_suffix =
dotnet_naming_style.private_of_internal_field.word_separator =
dotnet_naming_symbols.abstract_method.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.abstract_method.applicable_kinds = method
dotnet_naming_symbols.abstract_method.required_modifiers = abstract
dotnet_naming_symbols.class.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.class.applicable_kinds = class
dotnet_naming_symbols.class.required_modifiers =
dotnet_naming_symbols.delegate.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.delegate.applicable_kinds = delegate
dotnet_naming_symbols.delegate.required_modifiers =
dotnet_naming_symbols.enum.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.enum.applicable_kinds = enum
dotnet_naming_symbols.enum.required_modifiers =
dotnet_naming_symbols.event.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.event.applicable_kinds = event
dotnet_naming_symbols.event.required_modifiers =
dotnet_naming_symbols.interface.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.interface.applicable_kinds = interface
dotnet_naming_symbols.interface.required_modifiers =
dotnet_naming_symbols.method.applicable_accessibilities = public
dotnet_naming_symbols.method.applicable_kinds = method
dotnet_naming_symbols.method.required_modifiers =
dotnet_naming_symbols.non_field_members.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.non_field_members.applicable_kinds = property, event, method
dotnet_naming_symbols.non_field_members.required_modifiers =
dotnet_naming_symbols.private_method.applicable_accessibilities = private
dotnet_naming_symbols.private_method.applicable_kinds = method
dotnet_naming_symbols.private_method.required_modifiers =
dotnet_naming_symbols.private_or_internal_field.applicable_accessibilities = internal, private, private_protected
dotnet_naming_symbols.private_or_internal_field.applicable_kinds = field
dotnet_naming_symbols.private_or_internal_field.required_modifiers =
dotnet_naming_symbols.private_or_internal_static_field.applicable_accessibilities = internal, private, private_protected
dotnet_naming_symbols.private_or_internal_static_field.applicable_kinds = field
dotnet_naming_symbols.private_or_internal_static_field.required_modifiers = static
dotnet_naming_symbols.property.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.property.applicable_kinds = property
dotnet_naming_symbols.property.required_modifiers =
dotnet_naming_symbols.public_or_protected_field.applicable_accessibilities = public, protected
dotnet_naming_symbols.public_or_protected_field.applicable_kinds = field
dotnet_naming_symbols.public_or_protected_field.required_modifiers =
dotnet_naming_symbols.static_field.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.static_field.applicable_kinds = field
dotnet_naming_symbols.static_field.required_modifiers = static
dotnet_naming_symbols.static_method.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.static_method.applicable_kinds = method
dotnet_naming_symbols.static_method.required_modifiers = static
dotnet_naming_symbols.struct.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.struct.applicable_kinds = struct
dotnet_naming_symbols.struct.required_modifiers =
dotnet_naming_symbols.types.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.types.applicable_kinds = class, struct, interface, enum
dotnet_naming_symbols.types.required_modifiers =
dotnet_remove_unnecessary_suppression_exclusions = 0
dotnet_separate_import_directive_groups = true
dotnet_sort_system_directives_first = true
dotnet_style_allow_multiple_blank_lines_experimental = false:warning
dotnet_style_allow_statement_immediately_after_block_experimental = true
dotnet_style_coalesce_expression = true
dotnet_style_collection_initializer = true:warning
dotnet_style_explicit_tuple_names = true:warning
dotnet_style_namespace_match_folder = true
dotnet_style_null_propagation = true:warning
dotnet_style_object_initializer = true:warning
dotnet_style_operator_placement_when_wrapping = beginning_of_line
dotnet_style_parentheses_in_arithmetic_binary_operators = always_for_clarity
dotnet_style_parentheses_in_other_binary_operators = always_for_clarity
dotnet_style_parentheses_in_other_operators = never_if_unnecessary
dotnet_style_parentheses_in_relational_binary_operators = always_for_clarity
dotnet_style_predefined_type_for_locals_parameters_members = true
dotnet_style_predefined_type_for_member_access = true:warning
dotnet_style_prefer_auto_properties = true:warning
dotnet_style_prefer_compound_assignment = true:warning
dotnet_style_prefer_conditional_expression_over_assignment = false
dotnet_style_prefer_conditional_expression_over_return = false
dotnet_style_prefer_inferred_anonymous_type_member_names = true:warning
dotnet_style_prefer_inferred_tuple_names = true:warning
dotnet_style_prefer_is_null_check_over_reference_equality_method = true:warning
dotnet_style_prefer_simplified_boolean_expressions = true:warning
dotnet_style_prefer_simplified_interpolation = true
dotnet_style_qualification_for_event = false:error
dotnet_style_qualification_for_field = false
dotnet_style_qualification_for_method = false:error
dotnet_style_qualification_for_property = false:error
dotnet_style_readonly_field = true:warning
dotnet_style_require_accessibility_modifiers = for_non_interface_members
end_of_line = crlf
file_header_template = unset
indent_size = 4
indent_style = space
insert_final_newline = false
root = true
tab_width = 4
# https://docs.microsoft.com/en-us/dotnet/fundamentals/code-analysis/quality-rules/ca1822
# https://github.com/dotnet/aspnetcore/blob/main/.editorconfig
# https://github.com/dotnet/project-system/blob/main/.editorconfig
# Question
csharp_prefer_simple_using_statement = false # Question
csharp_style_expression_bodied_constructors = when_on_single_line:none # Question
csharp_style_expression_bodied_properties = true # Question
csharp_style_implicit_object_creation_when_type_is_apparent = true:warning # Question
csharp_style_pattern_matching_over_as_with_null_check = false # Question
csharp_style_prefer_pattern_matching = false # Question
csharp_style_prefer_range_operator = false # Question
csharp_style_prefer_switch_expression = false # Question
csharp_style_unused_value_assignment_preference = unused_local_variable # Question
csharp_style_unused_value_expression_statement_preference = false # Question
csharp_style_var_elsewhere = false:none # Question
csharp_style_var_for_built_in_types = false:none # Question
csharp_style_var_when_type_is_apparent = false:warning # Question
dotnet_diagnostic.CA1001.severity = none # Question - Types that own disposable fields should be disposable
dotnet_diagnostic.CA1051.severity = none # Question - Do not declare visible instance fields
dotnet_diagnostic.CA1416.severity = none # Question - This call site is reachable on all platforms.
dotnet_diagnostic.CA1510.severity = none # Question - Use
dotnet_diagnostic.CA1834.severity = none # CA1834: Consider using 'StringBuilder.Append(char)' when applicable
dotnet_diagnostic.CA1860.severity = none # Question - Avoid using
dotnet_diagnostic.CA1862.severity = none # Question - Prefer using
dotnet_diagnostic.CA2208.severity = none # Question - Instantiate argument exceptions correctly
dotnet_diagnostic.CA2211.severity = none # Question - Non-constant fields should not be visible
dotnet_diagnostic.CA2249.severity = none # Question - Use
dotnet_diagnostic.CA2253.severity = none # Question - Named placeholders should not be numeric values
dotnet_diagnostic.CS0103.severity = none # Question - The name
dotnet_diagnostic.CS0168.severity = none # Question - The variable
dotnet_diagnostic.CS0219.severity = none # Question - The variable
dotnet_diagnostic.CS0612.severity = none # Question - is obsolete
dotnet_diagnostic.CS0618.severity = none # Question - Compiler Warning (level 2)
dotnet_diagnostic.CS0659.severity = none # Question - Compiler Warning (level 3)
dotnet_diagnostic.CS8019.severity = warning # Question - Unnecessary using directive.
dotnet_diagnostic.CS8600.severity = none # Question - Converting null literal or possible null value to non-nullable type
dotnet_diagnostic.CS8602.severity = none # Question - Dereference of a possibly null reference.
dotnet_diagnostic.CS8603.severity = none # Question - Possible null reference return
dotnet_diagnostic.CS8604.severity = none # Question - Possible null reference argument for parameter.
dotnet_diagnostic.CS8618.severity = none # Question - Non-nullable variable must contain a non-null value when exiting constructor
dotnet_diagnostic.CS8625.severity = none # Question - Cannot convert null literal to non-nullable reference type.
dotnet_diagnostic.CS8629.severity = none # Question - Nullable value type may be null
dotnet_diagnostic.CS8765.severity = none # Question - Nullability of type of parameter
dotnet_diagnostic.IDE0005.severity = none # Question - Remove unnecessary using directives
dotnet_diagnostic.IDE0008.severity = warning # Question - Use explicit type instead of
dotnet_diagnostic.IDE0017.severity = none # Question - Object initialization can be simplified
dotnet_diagnostic.IDE0019.severity = none # Question - Use pattern matching
dotnet_diagnostic.IDE0021.severity = none # Question - Use expression body for constructor
dotnet_diagnostic.IDE0022.severity = none # Question - Use expression body for method
dotnet_diagnostic.IDE0025.severity = none # Question - Use expression body for property
dotnet_diagnostic.IDE0027.severity = none # Question - Use expression body for accessor
dotnet_diagnostic.IDE0028.severity = none # Question - Use collection initializers or expressions
dotnet_diagnostic.IDE0031.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0032.severity = none # Question - Use auto property
dotnet_diagnostic.IDE0037.severity = none # Question - Member name can be simplified
dotnet_diagnostic.IDE0041.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0047.severity = none # Question - Parentheses preferences
dotnet_diagnostic.IDE0049.severity = warning # Question - Name can be simplified
dotnet_diagnostic.IDE0051.severity = none # Question - Remove unused private member
dotnet_diagnostic.IDE0053.severity = none # Question - Use expression body for lambdas
dotnet_diagnostic.IDE0054.severity = none # Question - Use compound assignment
dotnet_diagnostic.IDE0055.severity = none # Question - Formatting rule
dotnet_diagnostic.IDE0057.severity = none # Question - Substring can be simplified
dotnet_diagnostic.IDE0058.severity = none # Question - Remove unnecessary expression value
dotnet_diagnostic.IDE0059.severity = none # Question - Unnecessary assignment of a value to
dotnet_diagnostic.IDE0060.severity = none # Question - Remove unused parameter
dotnet_diagnostic.IDE0063.severity = none # Question - Use simple
dotnet_diagnostic.IDE0065.severity = none # Question -
dotnet_diagnostic.IDE0066.severity = none # Question - Use
dotnet_diagnostic.IDE0078.severity = none # Question - Use pattern matching (may change code meaning)
dotnet_diagnostic.IDE0090.severity = warning # Question - Simplify new expression
dotnet_diagnostic.IDE0100.severity = error # Question - Remove redundant equality
dotnet_diagnostic.IDE0160.severity = warning # Question - Use block-scoped namespace
dotnet_diagnostic.IDE0161.severity = warning # Question - Namespace declaration preferences
dotnet_diagnostic.IDE0270.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0300.severity = none # Question - Collection initialization can be simplified
dotnet_diagnostic.IDE1006.severity = none # Question - Use collection expression for builder dotnet_style_prefer_collection_expression
dotnet_style_null_propagation = false # Question
dotnet_style_object_initializer = false # Question
dotnet_style_prefer_auto_properties = false # Question
dotnet_style_allow_statement_immediately_after_block_experimental = true # Question
dotnet_style_prefer_inferred_anonymous_type_member_names = false:warning # Question
dotnet_style_prefer_is_null_check_over_reference_equality_method = false # Question
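Illustrative only, not a file from the repository (all names below are invented): a class written to the key [*.cs] preferences above would use a file-scoped namespace, same-line braces (csharp_new_line_before_open_brace = none), no braces around single statements, no var, an underscore-prefixed PascalCase private static field, a collection expression, and an expression body for a one-line member.

namespace File_Folder_Helper.Examples;

internal static class StylePreview {
    private static readonly int[] _Defaults = [1, 2, 3];

    internal static bool IsEmpty() => _Defaults.Length == 0;

    internal static int Sum() {
        int total = 0;
        foreach (int value in _Defaults)
            total += value;
        return total;
    }
}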

@@ -0,0 +1,83 @@
using System.Globalization;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250218 {
internal static void MoveToArchive(ILogger<Worker> logger, List<string> args) {
string checkDirectory;
string searchMES = args[4];
string searchPattern = args[3];
string searchSequence = args[5];
string destinationRoot = args[6];
string checkDirectoryName = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] directories = Directory.GetDirectories(sourceDirectory, "*", SearchOption.TopDirectoryOnly);
foreach (string directory in directories) {
checkDirectory = Path.Combine(directory, checkDirectoryName);
if (!Directory.Exists(checkDirectory))
continue;
MoveToArchive(logger, searchPattern, searchMES, searchSequence, destinationRoot, checkDirectory);
}
}
private static void MoveToArchive(ILogger<Worker> logger, string searchPattern, string searchMES, string searchSequence, string destinationRoot, string checkDirectory) {
string[] files = Directory.GetFiles(checkDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogInformation("<{files}>(s)", files.Length);
else
MoveToArchive(logger, searchMES, searchSequence, destinationRoot, files);
}
private static void MoveToArchive(ILogger<Worker> logger, string searchMES, string searchSequence, string destinationRoot, string[] files) {
string mes;
string text;
string sequence;
string checkFile;
string[] matches;
FileInfo fileInfo;
string weekOfYear;
string[] segments;
string[] segmentsB;
string[] segmentsC;
string checkDirectory;
Calendar calendar = new CultureInfo("en-US").Calendar;
foreach (string file in files) {
fileInfo = new(file);
if (string.IsNullOrEmpty(fileInfo.DirectoryName))
continue;
text = File.ReadAllText(file);
segments = text.Split(searchMES);
if (segments.Length < 2)
continue;
segmentsB = text.Split(searchSequence);
if (segmentsB.Length < 2)
continue;
mes = segments[1].Split(';')[0];
sequence = segmentsB[1].Split(';')[0];
segmentsC = Path.GetFileName(fileInfo.DirectoryName).Split('-');
weekOfYear = $"{fileInfo.LastWriteTime.Year}_Week_{calendar.GetWeekOfYear(fileInfo.LastWriteTime, CalendarWeekRule.FirstDay, DayOfWeek.Sunday):00}";
checkDirectory = Path.GetFullPath(Path.Combine(destinationRoot, mes, weekOfYear, fileInfo.LastWriteTime.ToString("yyyy-MM-dd")));
if (!Directory.Exists(checkDirectory)) {
logger.LogInformation("<{checkDirectory}>", checkDirectory);
continue;
}
matches = Directory.GetDirectories(checkDirectory, sequence, SearchOption.AllDirectories);
if (matches.Length != 1) {
logger.LogInformation("!= 1 <{checkDirectory}>", checkDirectory);
continue;
}
checkFile = segmentsC.Length == 2 ? Path.Combine(matches[0], $"csv-{segmentsC[1]}-{fileInfo.Name}") : Path.Combine(matches[0], $"csv-{fileInfo.Name}");
if (File.Exists(checkFile)) {
logger.LogInformation("csv- {segmentsC} <{checkDirectory}>", segmentsC.Length, checkDirectory);
continue;
}
File.Move(fileInfo.FullName, checkFile);
}
}
}
// L:\DevOps\MESA_FI\file-folder-helper\bin\Debug\net8.0\win-x64>dotnet File-Folder-Helper.dll X \\mesfs.infineon.com\EC_EDA\Production\Traces Day-Helper-2025-02-18 Source *.pdsf A_MES_ENTITY= B_SEQUENCE= \\mesfs.infineon.com\EC_Characterization_Si\Archive

@@ -0,0 +1,372 @@
using System.Collections.ObjectModel;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250219 {
private record ProcessDataStandardFormat(ReadOnlyCollection<string> Body,
ReadOnlyCollection<string> Columns,
ReadOnlyCollection<string> Logistics,
long? Sequence);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(JsonElement[]))]
private partial class JsonElementCollectionSourceGenerationContext : JsonSerializerContext {
}
private record ProcessDataStandardFormatMapping(ReadOnlyCollection<string> BackfillColumns,
ReadOnlyCollection<int> ColumnIndices,
ReadOnlyCollection<string> IgnoreColumns,
ReadOnlyCollection<string> IndexOnlyColumns,
ReadOnlyDictionary<string, string> KeyValuePairs,
ReadOnlyCollection<string> NewColumnNames,
ReadOnlyCollection<string> OldColumnNames);
internal static void Compare(ILogger<Worker> logger, List<string> args) {
string[] segmentsB;
List<string> distinct = [];
string searchPattern = args[2];
string searchPatternB = args[3];
string[] segments = args[7].Split(',');
Dictionary<string, string> keyValuePairs = [];
ReadOnlyCollection<string> ignore = args[4].Split(',').AsReadOnly();
ReadOnlyCollection<string> backfill = args[5].Split(',').AsReadOnly();
ReadOnlyCollection<string> indexOnly = args[6].Split(',').AsReadOnly();
ReadOnlyCollection<string> oldColumnNames = args[8].Split(',').AsReadOnly();
ReadOnlyCollection<string> newColumnNames = args[9].Split(',').AsReadOnly();
ReadOnlyCollection<int> columnIndices = args[10].Split(',').Select(int.Parse).ToArray().AsReadOnly();
foreach (string segment in segments) {
segmentsB = segment.Split('|');
if (segmentsB.Length != 2)
continue;
if (distinct.Contains(segmentsB[0]))
continue;
distinct.Add(segmentsB[0]);
keyValuePairs.Add(segmentsB[0], segmentsB[1]);
}
ProcessDataStandardFormatMapping processDataStandardFormatMapping = new(BackfillColumns: backfill,
ColumnIndices: columnIndices,
NewColumnNames: newColumnNames,
IgnoreColumns: ignore,
IndexOnlyColumns: indexOnly,
KeyValuePairs: keyValuePairs.AsReadOnly(),
OldColumnNames: oldColumnNames);
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
logger.LogInformation("<{files}>(s)", files.Length);
Compare(logger, sourceDirectory.Length, searchPatternB, processDataStandardFormatMapping, files);
}
private static void Compare(ILogger<Worker> logger, int sourceDirectoryLength, string searchPattern, ProcessDataStandardFormatMapping pdsfMapping, string[] files) {
bool compare;
string directory;
string[] matches;
string directorySegment;
string[] directoryFiles;
const int columnsLine = 6;
JsonElement[]? jsonElementsNew;
JsonElement[]? jsonElementsOld;
ProcessDataStandardFormat processDataStandardFormat;
FileInfo[] collection = files.Select(l => new FileInfo(l)).ToArray();
foreach (FileInfo fileInfo in collection) {
directory = fileInfo.DirectoryName ?? throw new Exception();
directoryFiles = Directory.GetFiles(directory, searchPattern, SearchOption.TopDirectoryOnly);
matches = (from l in directoryFiles where l != fileInfo.FullName select l).ToArray();
if (matches.Length < 1)
continue;
directorySegment = directory[sourceDirectoryLength..];
processDataStandardFormat = GetProcessDataStandardFormat(logger, fileInfo.LastWriteTime, pdsfMapping.NewColumnNames.Count, columnsLine, fileInfo.FullName, lines: null);
jsonElementsNew = GetArray(logger, pdsfMapping.NewColumnNames.Count, processDataStandardFormat, lookForNumbers: false);
if (jsonElementsNew is null)
continue;
if (pdsfMapping.OldColumnNames.Count == pdsfMapping.ColumnIndices.Count) {
processDataStandardFormat = GetProcessDataStandardFormat(logger, pdsfMapping, jsonElementsNew, processDataStandardFormat);
Write(logger, fileInfo, processDataStandardFormat);
}
foreach (string match in matches) {
processDataStandardFormat = GetProcessDataStandardFormat(logger, fileInfo.LastWriteTime, pdsfMapping.OldColumnNames.Count, columnsLine, match, lines: null);
jsonElementsOld = GetArray(logger, pdsfMapping.OldColumnNames.Count, processDataStandardFormat, lookForNumbers: false);
if (jsonElementsOld is null || jsonElementsOld.Length != jsonElementsNew.Length) {
logger.LogWarning("! <{match}> (jsonElementsOld.Length:{jsonElementsOld} != jsonElementsNew.Length:{jsonElementsNew})", match, jsonElementsOld?.Length, jsonElementsNew.Length);
continue;
}
compare = Compare(logger, pdsfMapping, directorySegment, jsonElementsNew, jsonElementsOld);
if (!compare) {
logger.LogWarning("! <{match}>", match);
continue;
}
logger.LogInformation("<{match}>", match);
}
}
}
private static bool Compare(ILogger<Worker> logger, ProcessDataStandardFormatMapping processDataStandardFormatMapping, string directory, JsonElement[] jsonElementsNew, JsonElement[] jsonElementsOld) {
bool result;
int? q;
string valueNew;
string valueOld;
List<string> columns = [];
JsonProperty jsonPropertyOld;
JsonProperty jsonPropertyNew;
List<string> columnPairs = [];
JsonProperty[] jsonPropertiesOld;
JsonProperty[] jsonPropertiesNew;
List<string> unknownColumns = [];
List<string> differentColumns = [];
int last = jsonElementsOld.Length - 1;
List<string> sameAfterSpaceSplitColumns = [];
for (int i = last; i > -1; i--) {
if (jsonElementsOld[i].ValueKind != JsonValueKind.Object) {
unknownColumns.Add(string.Empty);
break;
}
jsonPropertiesOld = jsonElementsOld[i].EnumerateObject().ToArray();
jsonPropertiesNew = jsonElementsNew[i].EnumerateObject().ToArray();
for (int p = 0; p < jsonPropertiesOld.Length; p++) {
jsonPropertyOld = jsonPropertiesOld[p];
valueOld = jsonPropertyOld.Value.ToString();
if (processDataStandardFormatMapping.KeyValuePairs.TryGetValue(jsonPropertyOld.Name, out string? name) && !string.IsNullOrEmpty(name)) {
q = TryGetPropertyIndex(jsonPropertiesNew, name);
if (q is null && i == 0)
unknownColumns.Add($"{jsonPropertyOld.Name}|{name}");
} else {
q = TryGetPropertyIndex(jsonPropertiesNew, jsonPropertyOld.Name);
if (q is null) {
if (i == 0)
unknownColumns.Add(jsonPropertyOld.Name);
}
}
if (q is null) {
if (processDataStandardFormatMapping.IgnoreColumns.Contains(jsonPropertyOld.Name)) {
if (i == last) {
columns.Add("-1");
columnPairs.Add($"{jsonPropertyOld.Name}:");
logger.LogDebug("{p} )) {jsonPropertyOld.Name} **", p, jsonPropertyOld.Name);
}
continue;
}
if (i == last) {
columns.Add("-1");
columnPairs.Add($"{jsonPropertyOld.Name}:");
if (!string.IsNullOrEmpty(valueOld))
logger.LogDebug("{p} )) {jsonPropertyOld.Name} ??", p, jsonPropertyOld.Name);
}
} else {
jsonPropertyNew = jsonPropertiesNew[q.Value];
if (i == last) {
columns.Add(q.Value.ToString());
columnPairs.Add($"{jsonPropertyOld.Name}:{jsonPropertyNew.Name}");
}
valueNew = jsonPropertyNew.Value.ToString();
if (i == last)
logger.LogDebug("{p} )) {jsonPropertyOld.Name} ~~ {q.Value} => {jsonPropertyNew.Name}", p, jsonPropertyOld.Name, q.Value, jsonPropertyNew.Name);
if (valueNew != valueOld && !differentColumns.Contains(jsonPropertyOld.Name)) {
if (valueNew.Length >= 2 && valueNew.Split(' ')[0] == valueOld)
sameAfterSpaceSplitColumns.Add(jsonPropertyOld.Name);
else {
if (processDataStandardFormatMapping.BackfillColumns.Contains(jsonPropertyOld.Name) && i != last)
continue;
if (processDataStandardFormatMapping.IndexOnlyColumns.Contains(jsonPropertyOld.Name) && int.TryParse(jsonPropertyOld.Name[^2..], out int index) && i != index - 1)
continue;
logger.LogWarning("For [{jsonProperty.Name}] <{directory}> doesn't match (valueNew:{valueNew} != valueOld:{valueOld})!", jsonPropertyOld.Name, directory, valueNew, valueOld);
differentColumns.Add(jsonPropertyOld.Name);
}
}
}
}
if (i == last) {
logger.LogInformation(string.Join(',', columns));
logger.LogInformation($"{string.Join(';', columnPairs)};");
}
}
result = unknownColumns.Count == 0 && differentColumns.Count == 0 && sameAfterSpaceSplitColumns.Count == 0;
return result;
}
private static int? TryGetPropertyIndex(JsonProperty[] jsonProperties, string propertyName) {
int? result = null;
for (int i = 0; i < jsonProperties.Length; i++) {
if (jsonProperties[i].Name != propertyName)
continue;
result = i;
break;
}
if (result is null) {
for (int i = 0; i < jsonProperties.Length; i++) {
if (jsonProperties[i].Name[0] != propertyName[0])
continue;
if (jsonProperties[i].Name.Length != propertyName.Length)
continue;
if (jsonProperties[i].Name != propertyName)
continue;
result = i;
break;
}
}
return result;
}
private static ProcessDataStandardFormat GetProcessDataStandardFormat(ILogger<Worker> logger, DateTime lastWriteTime, int expectedColumns, int columnsLine, string path, string[]? lines) {
ProcessDataStandardFormat result;
long sequence;
string[] segments;
List<string> body = [];
List<string> logistics = [];
bool lookForLogistics = false;
lines ??= File.ReadAllLines(path);
if (lines.Length <= columnsLine)
segments = [];
else {
segments = lines[columnsLine].Split('\t');
if (segments.Length != expectedColumns) {
logger.LogWarning("{segments} != {expectedColumns}", segments.Length, expectedColumns);
segments = [];
}
}
string[] columns = segments.Select(l => l.Trim('"')).ToArray();
for (int r = columnsLine + 1; r < lines.Length; r++) {
if (lines[r].StartsWith("NUM_DATA_ROWS"))
lookForLogistics = true;
if (!lookForLogistics) {
body.Add(lines[r]);
continue;
}
if (lines[r].StartsWith("LOGISTICS_1")) {
for (int i = r; i < lines.Length; i++) {
if (lines[r].StartsWith("END_HEADER"))
break;
logistics.Add(lines[i]);
}
break;
}
}
if (logistics.Count == 0)
sequence = lastWriteTime.Ticks;
else {
segments = logistics[0].Split("SEQUENCE=");
sequence = segments.Length < 2 || !long.TryParse(segments[1].Split(';')[0], out long s) ? lastWriteTime.Ticks : s;
}
result = new(Body: body.AsReadOnly(),
Columns: columns.AsReadOnly(),
Logistics: logistics.AsReadOnly(),
Sequence: sequence);
return result;
}
private static JsonElement[]? GetArray(ILogger<Worker> logger, int expectedColumns, ProcessDataStandardFormat processDataStandardFormat, bool lookForNumbers) {
JsonElement[]? results;
if (processDataStandardFormat.Body.Count == 0 || !processDataStandardFormat.Body[0].Contains('\t'))
results = JsonSerializer.Deserialize("[]", JsonElementCollectionSourceGenerationContext.Default.JsonElementArray) ?? throw new Exception();
else {
string value;
string[] segments;
List<string> lines = [];
StringBuilder stringBuilder = new();
foreach (string bodyLine in processDataStandardFormat.Body) {
_ = stringBuilder.Clear();
_ = stringBuilder.Append('{');
segments = bodyLine.Split('\t');
if (segments.Length != expectedColumns) {
logger.LogWarning("{segments} != {expectedColumns}", segments.Length, expectedColumns);
continue;
}
if (!lookForNumbers) {
for (int c = 0; c < segments.Length; c++) {
value = segments[c].Replace("\\", "\\\\").Replace("\"", "\\\"");
_ = stringBuilder.Append('"').Append(processDataStandardFormat.Columns[c]).Append("\":\"").Append(value).Append("\",");
}
} else {
for (int c = 0; c < segments.Length; c++) {
value = segments[c].Replace("\\", "\\\\").Replace("\"", "\\\"");
if (string.IsNullOrEmpty(value))
_ = stringBuilder.Append('"').Append(processDataStandardFormat.Columns[c]).Append("\":").Append(value).Append("null,");
else if (value.All(char.IsDigit))
_ = stringBuilder.Append('"').Append(processDataStandardFormat.Columns[c]).Append("\":").Append(value).Append(',');
else
_ = stringBuilder.Append('"').Append(processDataStandardFormat.Columns[c]).Append("\":\"").Append(value).Append("\",");
}
}
_ = stringBuilder.Remove(stringBuilder.Length - 1, 1);
_ = stringBuilder.AppendLine("}");
lines.Add(stringBuilder.ToString());
}
string json = $"[{string.Join(',', lines)}]";
results = JsonSerializer.Deserialize(json, JsonElementCollectionSourceGenerationContext.Default.JsonElementArray);
}
return results;
}
private static ProcessDataStandardFormat GetProcessDataStandardFormat(ILogger<Worker> logger, ProcessDataStandardFormatMapping processDataStandardFormatMapping, JsonElement[] jsonElements, ProcessDataStandardFormat processDataStandardFormat) {
ProcessDataStandardFormat result;
int column;
string value;
List<string> values = [];
List<string> results = [];
JsonProperty jsonProperty;
JsonProperty[] jsonProperties;
List<string> unknownColumns = [];
for (int i = 0; i < jsonElements.Length; i++) {
values.Clear();
if (jsonElements[i].ValueKind != JsonValueKind.Object) {
unknownColumns.Add(string.Empty);
break;
}
jsonProperties = jsonElements[i].EnumerateObject().ToArray();
if (jsonProperties.Length != processDataStandardFormatMapping.NewColumnNames.Count) {
logger.LogWarning("{jsonProperties} != {NewColumnNames}", jsonProperties.Length, processDataStandardFormatMapping.NewColumnNames.Count);
continue;
}
for (int c = 0; c < processDataStandardFormatMapping.ColumnIndices.Count; c++) {
column = processDataStandardFormatMapping.ColumnIndices[c];
if (column == -1)
value = processDataStandardFormatMapping.OldColumnNames[c];
else {
jsonProperty = jsonProperties[column];
value = jsonProperty.Value.ToString();
}
values.Add(value);
}
results.Add(string.Join('\t', values));
}
result = new(Body: new(results),
Columns: processDataStandardFormatMapping.OldColumnNames,
Logistics: processDataStandardFormat.Logistics,
Sequence: processDataStandardFormat.Sequence);
return result;
}
private static void Write(ILogger<Worker> logger, FileInfo fileInfo, ProcessDataStandardFormat processDataStandardFormat) {
List<string> results = [];
if (processDataStandardFormat.Sequence is null)
throw new NullReferenceException(nameof(processDataStandardFormat.Sequence));
string endOffset = "E#######T";
string dataOffset = "D#######T";
string headerOffset = "H#######T";
string format = "MM/dd/yyyy HH:mm:ss";
string startTime = new DateTime(processDataStandardFormat.Sequence.Value).ToString(format);
results.Add("HEADER_TAG\tHEADER_VALUE");
results.Add("FORMAT\t2.00");
results.Add("NUMBER_PASSES\t0001");
results.Add($"HEADER_OFFSET\t{headerOffset}");
results.Add($"DATA_OFFSET\t{dataOffset}");
results.Add($"END_OFFSET\t{endOffset}");
results.Add($"\"{string.Join("\"\t\"", processDataStandardFormat.Columns)}\"");
results.AddRange(processDataStandardFormat.Body);
results.Add($"NUM_DATA_ROWS\t{processDataStandardFormat.Body.Count.ToString().PadLeft(9, '0')}");
results.Add($"NUM_DATA_COLUMNS\t{processDataStandardFormat.Columns.Count.ToString().PadLeft(9, '0')}");
results.Add("DELIMITER\t;");
results.Add($"START_TIME_FORMAT\t{format}");
results.Add($"START_TIME\t{startTime}");
results.Add("LOGISTICS_COLUMN\tA_LOGISTICS");
results.Add("LOGISTICS_COLUMN\tB_LOGISTICS");
results.AddRange(processDataStandardFormat.Logistics);
File.WriteAllText($"{fileInfo.FullName}.tsv", string.Join(Environment.NewLine, results));
logger.LogDebug("<{fileInfo}>", fileInfo);
}
}
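Compare above consumes a long positional argument list. A minimal, hypothetical sketch of how that list maps onto the ProcessDataStandardFormatMapping fields follows; every path, pattern, and column name is invented, and logger is assumed to be an available ILogger<Worker>. args[10] supplies one index per old column into the new column layout, with -1 meaning the old column has no counterpart and its name is written out as a constant.

List<string> args =
[
    @"D:\Traces",      // args[0]: source directory, searched recursively
    "",                // args[1]: not used by this helper
    "*.pdsf",          // args[2]: searchPattern for the new-format files
    "*.old.pdsf",      // args[3]: searchPatternB for sibling old-format files
    "IgnoreMe",        // args[4]: IgnoreColumns
    "BackfillMe",      // args[5]: BackfillColumns
    "IndexOnly01",     // args[6]: IndexOnlyColumns
    "OldName|NewName", // args[7]: KeyValuePairs, old|new pairs separated by ','
    "OldName,Shared",  // args[8]: OldColumnNames
    "NewName,Shared",  // args[9]: NewColumnNames
    "0,1"              // args[10]: ColumnIndices into the new layout (-1 = none)
];
Helper20250219.Compare(logger, args);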

@@ -0,0 +1,111 @@
using System.Collections.ObjectModel;
using System.Text.Json;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250228 {
private record Record(string TableName, ReadOnlyCollection<string> Columns, ReadOnlyCollection<string[]> Rows);
internal static void PostgresDumpToJson(ILogger<Worker> logger, List<string> args) {
string searchPattern = args[2];
string headerA = args[3].Replace('_', ' ');
string headerB = args[4].Replace('_', ' ');
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length != 1)
logger.LogWarning("<{files}>(s)", files.Length);
else
PostgresDumpToJson(logger, headerA, headerB, files[0]);
}
private static void PostgresDumpToJson(ILogger<Worker> logger, string headerA, string headerB, string file) {
ReadOnlyCollection<Record> records = GetRecords(headerA, headerB, file);
if (records.Count > 0)
WriteFile(file, records);
else
logger.LogWarning("<{records}>(s)", records.Count);
}
private static ReadOnlyCollection<Record> GetRecords(string headerA, string headerB, string file) {
List<Record> results = [];
string line;
string[] segmentsA;
string[] segmentsB;
string[] segmentsC;
string[] segmentsD;
string[] segmentsE;
string[] segmentsF;
List<string[]> rows;
string? tableName = null;
string[] lines = File.ReadAllLines(file);
ReadOnlyCollection<string>? columns = null;
for (int i = 0; i < lines.Length; i++) {
line = lines[i];
if (tableName is null) {
segmentsA = line.Split(headerA);
if (segmentsA.Length != 2)
continue;
segmentsB = segmentsA[1].Split(headerB);
if (segmentsB.Length != 2)
continue;
segmentsC = segmentsB[0].Split('(');
if (segmentsC.Length != 2)
continue;
segmentsD = segmentsC[1].Split(')');
if (segmentsD.Length != 2)
continue;
columns = segmentsD[0].Split(',').Select(l => l.Trim(' ').Trim('"')).ToArray().AsReadOnly();
if (columns.Count == 0)
continue;
segmentsE = segmentsB[0].Split(' ');
tableName = segmentsE[0];
} else if (columns is null)
break;
else {
rows = [];
for (int j = i + 1; j < lines.Length; j++) {
i = j;
segmentsF = lines[j].Split('\t');
if (segmentsF.Length != columns.Count) {
if (rows.Count > 0)
results.Add(new(TableName: tableName, Columns: columns, Rows: rows.AsReadOnly()));
break;
}
rows.Add(segmentsF);
}
columns = null;
tableName = null;
}
}
return results.AsReadOnly();
}
private static void WriteFile(string file, ReadOnlyCollection<Record> records) {
List<string> results = [];
string json;
string text;
Dictionary<string, string?> keyValuePairs = [];
foreach (Record record in records) {
results.Clear();
foreach (string[] row in record.Rows) {
keyValuePairs.Clear();
for (int i = 0; i < row.Length; i++) {
if (row[i] == "\\N")
keyValuePairs.Add(record.Columns[i], null);
else
keyValuePairs.Add(record.Columns[i], row[i]);
}
#pragma warning disable IL3050, IL2026
json = JsonSerializer.Serialize(keyValuePairs);
#pragma warning restore IL3050, IL2026
results.Add(json);
}
text = string.Join($",{Environment.NewLine}", results);
File.WriteAllText($"{file[..^4]}-{record.TableName}.json", $"[{Environment.NewLine}{text}{Environment.NewLine}]");
}
}
}
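PostgresDumpToJson above slices each line of a plain-text pg_dump file around two caller-supplied markers whose underscores stand in for spaces. Assuming a standard COPY block such as "COPY public.users (id, name, created) FROM stdin;" followed by tab-separated rows and a terminating "\.", a hypothetical argument list (paths and marker values invented, logger assumed to be an ILogger<Worker>) would let GetRecords recover the table name "users" and the columns id, name, and created.

List<string> args =
[
    @"D:\Dumps",    // args[0]: source directory, searched recursively
    "",             // args[1]: not used by this helper
    "*.sql",        // args[2]: searchPattern; exactly one file must match
    "COPY_public.", // args[3]: headerA, '_' -> ' ' gives "COPY public."
    "FROM_stdin;"   // args[4]: headerB, '_' -> ' ' gives "FROM stdin;"
];
Helper20250228.PostgresDumpToJson(logger, args);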

@@ -0,0 +1,56 @@
using System.Collections.ObjectModel;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250301 {
internal static void PocketBaseImportWithDeno(ILogger<Worker> logger, List<string> args) {
char split = args[3][0];
string directory = args[6];
string scriptName = args[5];
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string workingDirectory = Path.GetFullPath(args[4]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogWarning("<{files}>(s)", files.Length);
else
PocketBaseImportWithDeno(logger, split, workingDirectory, scriptName, directory, files);
}
private static void PocketBaseImportWithDeno(ILogger<Worker> logger, char split, string workingDirectory, string scriptName, string directory, string[] files) {
string checkFile = Path.Combine(workingDirectory, scriptName);
if (!File.Exists(checkFile))
logger.LogWarning("<{checkFile}> doesn't exist!", checkFile);
else {
ReadOnlyCollection<string> fileNames = CopyFiles(split, workingDirectory, directory, files);
if (fileNames.Count == 0)
logger.LogWarning("<{fileNames}>(s)", fileNames.Count);
else {
foreach (string fileName in fileNames)
logger.LogInformation("deno run --unstable --allow-read --allow-env --allow-net {scriptName} --id=true --input={fileName}", scriptName, fileName);
}
}
}
private static ReadOnlyCollection<string> CopyFiles(char split, string workingDirectory, string directory, string[] files) {
List<string> results = [];
string fileName;
string checkFile;
string checkDirectory = Path.Combine(workingDirectory, directory);
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
foreach (string file in files) {
fileName = Path.GetFileName(file).Split(split)[^1];
checkFile = Path.Combine(checkDirectory, fileName);
if (File.Exists(checkFile))
File.Delete(checkFile);
File.Copy(file, checkFile);
results.Add(fileName);
}
return results.AsReadOnly();
}
}

@@ -0,0 +1,140 @@
using System.Collections.ObjectModel;
using System.Text.Json;
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250305 {
private static readonly HttpClient _HttpClient = new();
private record Record(Uri URI, string Path, DateTime LastModified, int? TotalSeconds);
internal static void WriteNginxFileSystemDelta(ILogger<Worker> logger, List<string> args) {
string host = args[2];
string rootDirectoryName = args[3];
string format = NginxFileSystem.GetFormat();
TimeZoneInfo timeZoneInfo = TimeZoneInfo.Local;
string compareDirectory = Path.GetFullPath(args[0]);
logger.LogInformation("Comparing files on {host}", host);
ReadOnlyCollection<Record> records = GetRecords(format, timeZoneInfo, host, new([rootDirectoryName]), compareDirectory);
#if ShellProgressBar
ProgressBar progressBar = new(records.Count, "Downloading", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
foreach (Record record in records) {
#if ShellProgressBar
progressBar.Tick();
#endif
if (record.TotalSeconds is null)
Download(record);
else if (record.TotalSeconds.Value == 0)
logger.LogInformation("Different lengths");
else if (record.TotalSeconds.Value > 0)
logger.LogInformation("Overwrite remote (https)");
else
logger.LogInformation("Overwrite local");
}
#if ShellProgressBar
progressBar.Dispose();
#endif
}
private static ReadOnlyCollection<Record> GetRecords(string format, TimeZoneInfo timeZoneInfo, string host, ReadOnlyCollection<string> directoryNames, string compareDirectory) {
List<Record> results = [];
Uri uri = new($"https://{host}/{string.Join('/', directoryNames)}");
ReadOnlyCollection<NginxFileSystem>? nginxFileSystems = GetCollection(format, timeZoneInfo, uri);
if (nginxFileSystems is not null) {
NginxFileSystem nginxFileSystem;
ReadOnlyCollection<Record> records;
string checkDirectory = $"{compareDirectory}\\{string.Join('\\', directoryNames)}";
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
for (int i = 0; i < nginxFileSystems.Count; i++) {
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
if (nginxFileSystem.Type == "file") {
Record? record = CompareFile(host, directoryNames, compareDirectory, nginxFileSystem);
if (record is not null)
results.Add(record);
} else {
records = CompareDirectory(format, timeZoneInfo, host, directoryNames, compareDirectory, nginxFileSystem);
foreach (Record record in records)
results.Add(record);
}
}
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<NginxFileSystem>? GetCollection(string format, TimeZoneInfo timeZoneInfo, Uri uri) {
List<NginxFileSystem>? results;
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(uri);
taskHttpResponseMessage.Wait();
if (!taskHttpResponseMessage.Result.IsSuccessStatusCode)
results = null;
else {
Task<string> taskString = taskHttpResponseMessage.Result.Content.ReadAsStringAsync();
taskString.Wait();
if (taskString.Result.StartsWith('<'))
results = null;
else {
NginxFileSystem[]? nginxFileSystems = JsonSerializer.Deserialize(taskString.Result, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (nginxFileSystems is null)
results = null;
else {
results = [];
NginxFileSystem nginxFileSystem;
for (int i = 0; i < nginxFileSystems.Length; i++) {
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
results.Add(nginxFileSystem);
}
}
}
}
return results?.AsReadOnly();
}
private static Record? CompareFile(string host, ReadOnlyCollection<string> directoryNames, string compareDirectory, NginxFileSystem nginxFileSystem) {
Record? result;
if (nginxFileSystem.LastModified is null || nginxFileSystem.Length is null)
result = null;
else {
Uri uri = new($"https://{host}/{string.Join('/', directoryNames)}/{nginxFileSystem.Name}");
FileInfo fileInfo = new($"{compareDirectory}\\{string.Join('\\', directoryNames)}\\{nginxFileSystem.Name}");
if (!fileInfo.Exists)
result = new(URI: uri, Path: fileInfo.FullName, LastModified: nginxFileSystem.LastModified.Value, TotalSeconds: null);
else {
int totalSeconds = (int)new TimeSpan(fileInfo.LastWriteTime.Ticks - nginxFileSystem.LastModified.Value.Ticks).TotalSeconds;
if (totalSeconds is >= 2 or <= -2)
result = new(URI: uri, Path: fileInfo.FullName, LastModified: nginxFileSystem.LastModified.Value, TotalSeconds: totalSeconds);
else if (fileInfo.Length != nginxFileSystem.Length.Value)
result = new(URI: uri, Path: fileInfo.FullName, LastModified: nginxFileSystem.LastModified.Value, TotalSeconds: 0);
else
result = null;
}
}
return result;
}
private static ReadOnlyCollection<Record> CompareDirectory(string format, TimeZoneInfo timeZoneInfo, string host, ReadOnlyCollection<string> directoryNames, string compareDirectory, NginxFileSystem nginxFileSystem) {
ReadOnlyCollection<Record> results;
List<string> collection = directoryNames.ToList();
collection.Add(nginxFileSystem.Name);
results = GetRecords(format, timeZoneInfo, host, collection.AsReadOnly(), compareDirectory);
return results;
}
private static void Download(Record record) {
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(record.URI);
taskHttpResponseMessage.Wait();
if (taskHttpResponseMessage.Result.IsSuccessStatusCode) {
Task<string> taskString = taskHttpResponseMessage.Result.Content.ReadAsStringAsync();
taskString.Wait();
File.WriteAllText(record.Path, taskString.Result);
File.SetLastWriteTime(record.Path, record.LastModified);
}
}
}
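Download above reads the response body as a string, which matches the text files this helper targets; if binary content ever needed syncing, a byte-based variant along these lines could be used instead (BinaryDownloader is an illustrative sketch, with the URI, path, and timestamp passed in place of the private Record):
internal static class BinaryDownloader {
private static readonly HttpClient _HttpClient = new();
// Byte-safe counterpart of Download: same flow, but the payload is written with WriteAllBytes.
internal static void Download(Uri uri, string path, DateTime lastModified) {
Task<HttpResponseMessage> taskHttpResponseMessage = _HttpClient.GetAsync(uri);
taskHttpResponseMessage.Wait();
if (!taskHttpResponseMessage.Result.IsSuccessStatusCode)
return;
Task<byte[]> taskBytes = taskHttpResponseMessage.Result.Content.ReadAsByteArrayAsync();
taskBytes.Wait();
File.WriteAllBytes(path, taskBytes.Result);
File.SetLastWriteTime(path, lastModified);
}
}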


@ -0,0 +1,85 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250306 {
internal static void ProcessDataStandardFormatToJson(ILogger<Worker> logger, List<string> args) {
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.TopDirectoryOnly);
if (files.Length != 1)
logger.LogWarning("<{files}>(s)", files.Length);
else
ProcessDataStandardFormatToJson(logger, files[0]);
}
private static void ProcessDataStandardFormatToJson(ILogger<Worker> logger, string file) {
string[] lines = File.ReadAllLines(file);
int? columnTitlesLine = GetProcessDataStandardFormatColumnTitlesLine(lines);
if (columnTitlesLine is null)
logger.LogWarning("<{columnTitlesLine}> is null", nameof(columnTitlesLine));
else {
string? text = ProcessDataStandardFormatToLastDataLine(lines, columnTitlesLine.Value);
File.WriteAllText(Path.Combine(".vscode", "helper", ".lbl"), text);
if (lines.Length < columnTitlesLine.Value + 1)
logger.LogWarning("<{lines}>(s)", lines.Length);
else {
string json = ProcessDataStandardFormatToJson(columnTitlesLine.Value, [], lines);
File.WriteAllText(Path.Combine(".vscode", "helper", ".json"), json);
}
}
}
private static int? GetProcessDataStandardFormatColumnTitlesLine(string[] lines) {
int? result = null;
bool foundEndOfFile = false;
for (int i = 0; i < lines.Length; i++) {
if (lines[i] == "EOF")
foundEndOfFile = true;
if (foundEndOfFile && lines[i].StartsWith("END_OFFSET") && i + 3 < lines.Length) {
result = i + 2;
break;
}
}
return result;
}
private static string? ProcessDataStandardFormatToLastDataLine(string[] lines, int columnTitlesLine) {
string? result = null;
for (int i = columnTitlesLine + 1; i < lines.Length; i++) {
if (lines[i].StartsWith("NUM_DATA_ROWS")) {
result = lines[i - 2];
break;
}
}
return result;
}
private static string ProcessDataStandardFormatToJson(int columnTitlesLine, string[] columns, string[] lines) {
#pragma warning disable CA1845, IDE0057
string result = "[\n";
string line;
string value;
string[] segments;
if (columns.Length == 0)
columns = lines[columnTitlesLine].Trim().Split('|');
int columnsLength = columns.Length - 2;
for (int i = columnTitlesLine + 1; i < lines.Length; i++) {
line = "{";
segments = lines[i].Trim().Split('|');
if (segments.Length != columnsLength)
continue;
for (int c = 1; c < segments.Length; c++) {
// Escape backslashes before quotes so the inserted \" sequences are not re-escaped.
value = segments[c].Replace("\\", "\\\\").Replace("\"", "\\\"");
line += '"' + columns[c].Trim('"') + '"' + ':' + '"' + value + '"' + ',';
}
line = line.Substring(0, line.Length - 1) + '}' + ',' + '\n';
result += line;
}
result = result.Substring(0, result.Length - 2) + ']';
return result;
#pragma warning restore CA1845, IDE0057
}
}
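A quick standalone check of the escaping order used in ProcessDataStandardFormatToJson: backslashes have to be doubled before quotes are escaped, otherwise the freshly inserted \" sequences get their backslash doubled again and the emitted JSON breaks (EscapeOrderCheck and the sample value are illustrative only):
internal static class EscapeOrderCheck {
internal static void Main() {
string raw = "path\\to\\\"file\"";
// Correct order: backslashes first, then quotes.
string good = raw.Replace("\\", "\\\\").Replace("\"", "\\\"");
// Wrong order: quotes become \" first and their backslash is then doubled into \\".
string bad = raw.Replace("\"", "\\\"").Replace("\\", "\\\\");
Console.WriteLine(good); // path\\to\\\"file\"
Console.WriteLine(bad); // path\\to\\\\"file\\"
}
}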


@ -0,0 +1,46 @@
using File_Folder_Helper.Helpers;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250315 {
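// args[0] is the source directory and args[2] is a '~'-separated pattern list: a single pattern prunes empty
// sibling directories of each matched file; multiple patterns instead rename every match by appending ".json".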
internal static void Empty(ILogger<Worker> logger, List<string> args) {
string[] searchPatterns = args[2].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
if (searchPatterns.Length == 1) {
string[] files = Directory.GetFiles(sourceDirectory, searchPatterns[0], SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogWarning("<{files}>(s)", files.Length);
else {
string directoryName;
string[] directories;
foreach (string file in files) {
directoryName = Path.GetDirectoryName(file) ?? throw new Exception();
directories = Directory.GetDirectories(directoryName, "*", SearchOption.TopDirectoryOnly);
foreach (string directory in directories)
HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, directory);
}
}
} else {
string[] files;
string checkFile;
HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, sourceDirectory);
foreach (string searchPattern in searchPatterns) {
files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogWarning("<{files}>(s)", files.Length);
else {
foreach (string file in files) {
checkFile = $"{file}.json";
if (File.Exists(checkFile))
continue;
File.Move(file, checkFile);
}
}
}
}
}
}


@ -0,0 +1,521 @@
using System.Collections.ObjectModel;
using System.Diagnostics;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.RegularExpressions;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250320 {
private record Match(string Name,
string Parameters,
string Result,
string Scope,
string Static,
string Value,
string Async,
string Partial);
private record Search(string Constructor,
string Delegate,
string Name,
string Not,
string Wrap);
private record Method(int? EndLine,
ReadOnlyDictionary<string, string> Parameters,
string FirstLine,
int I,
string Line,
Match Match,
ReadOnlyCollection<int> ReferenceToLineNumbers,
int? ScopeEnum,
Search Search,
int StartLine) {
public override string ToString() {
string result = JsonSerializer.Serialize(this, MethodCollectionCommonSourceGenerationContext.Default.Method);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Method[]))]
private partial class MethodCollectionCommonSourceGenerationContext : JsonSerializerContext {
}
private record MethodWith(int? EndLine,
ReadOnlyDictionary<string, string> Parameters,
string FirstLine,
string Line,
Match Match,
ReadOnlyCollection<MethodWith> References,
ReadOnlyCollection<int> ReferenceToLineNumbers,
int? ScopeEnum,
Search Search,
int StartLine) {
public override string ToString() {
string result = JsonSerializer.Serialize(this, MethodCollectionCommonSourceGenerationContext.Default.Method);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(MethodWith[]))]
private partial class MethodWithCollectionCommonSourceGenerationContext : JsonSerializerContext {
}
private const string _Name = "name";
private const string _Async = "async";
private const string _Scope = "scope";
private const string _Result = "result";
private const string _Static = "static";
private const string _Partial = "partial";
private const string _Parameters = "parameters";
[GeneratedRegex(@"[[\]<,>?a-zA-Z0-9_()\s]*?\s[a-z_]{1}[a-zA-Z0-9_]*?,")]
private static partial Regex CSharpParameter();
// VSCode Search ^\s*\b(?<scope>public|private|internal|protected|\sI[a-zA-Z0-9_]*\.)\s?\b(?<static>static)?\s?\b(?<partial>partial)?\s?\b(?<async>async)?\s?\b(?<result>[\[\]\.\?<,>a-zA-Z0-9_()\s]*?)\s?\b(?<name>[A-Z_]{1}[a-zA-Z0-9_])+\((?<parameters>.*)\)\s?\{?$
[GeneratedRegex(@"^\s*\b(?<scope>public|private|internal|protected|\sI[a-zA-Z0-9_]*\.)\s?\b(?<static>static)?\s?\b(?<partial>partial)?\s?\b(?<async>async)?\s?\b(?<result>[\[\]\.\?<,>a-zA-Z0-9_()\s]*?)\s?\b(?<name>[A-Z_]{1}[a-zA-Z0-9_]*)+\((?<parameters>.*)\)\s?\{?$")]
private static partial Regex CSharpMethodLine();
private static ReadOnlyCollection<Method> GetSortedMethods(ReadOnlyCollection<Method> methods) =>
(from l in methods orderby l.ScopeEnum descending, l.ReferenceToLineNumbers.Count descending, l.Line.Length, l.Match.Name.Length, l.Match.Name select l).ToArray().AsReadOnly();
internal static void SortCodeMethods(ILogger<Worker> logger, List<string> args, CancellationToken cancellationToken) {
bool check;
string[] lines;
List<string> changed = [];
bool usePathCombine = true;
long ticks = DateTime.Now.Ticks;
bool logOnly = bool.Parse(args[2]);
int scopeSpaces = int.Parse(args[3]);
logger.LogInformation("{ticks}", ticks);
string repositoryDirectory = Path.GetFullPath(args[0]);
string[] cSharpFiles = Directory.GetFiles(repositoryDirectory, "*.cs", SearchOption.AllDirectories);
ReadOnlyCollection<string> gitOthersModifiedAndDeletedExcludingStandardFiles = logOnly ? new(cSharpFiles) : Helpers.HelperGit.GetOthersModifiedAndDeletedExcludingStandardFiles(repositoryDirectory, usePathCombine, cancellationToken);
foreach (string cSharpFile in cSharpFiles) {
if (!gitOthersModifiedAndDeletedExcludingStandardFiles.Contains(cSharpFile))
continue;
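// Re-read and re-sort up to ten times; each pass that rewrites the file is recorded, and the loop
// stops after a pass with no change (or immediately after the first pass when only logging).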
for (int i = 0; i < 10; i++) {
lines = File.ReadAllLines(cSharpFile);
check = SortFile(logger, logOnly, scopeSpaces, cSharpFile, lines);
if (check) {
Thread.Sleep(500);
changed.Add($"{i + 1:00}) {cSharpFile}");
}
if (logOnly || !check)
break;
}
}
if (changed.Count == 0)
logger.LogInformation("No changes :)");
else {
changed.Reverse();
foreach (string c in changed)
logger.LogInformation(c);
}
}
private static bool SortFile(ILogger<Worker> logger, bool logOnly, int scopeSpaces, string cSharpFile, string[] lines) {
bool result;
ReadOnlyCollection<Method> methods = GetMethods(logger, scopeSpaces, cSharpFile, lines);
if (methods.Count == 0)
result = false;
else if (methods.Any(l => l.EndLine is null))
result = false;
else if (logOnly) {
foreach (Method method in methods)
logger.LogInformation("{cSharpFile} - {Name} has {lines} line(s)", cSharpFile, method.Match.Name, (method.EndLine is null ? 999999 : method.EndLine.Value - method.StartLine).ToString("000000"));
result = false;
} else {
ReadOnlyCollection<Method> sortedMethods = GetSortedMethods(methods);
if (Debugger.IsAttached)
File.WriteAllText(Path.Combine(".vscode", "helper", ".json"), JsonSerializer.Serialize(sortedMethods.ToArray(), MethodCollectionCommonSourceGenerationContext.Default.MethodArray));
ReadOnlyCollection<MethodWith> collection = GetCollection(logger, lines, sortedMethods);
result = WriteAllLines(cSharpFile, lines, collection);
}
return result;
}
private static ReadOnlyCollection<Method> GetMethods(ILogger<Worker> logger, int scopeSpaces, string cSharpFile, string[] lines) {
List<Method> results = [];
int check;
int blocks;
bool isLinq;
Match match;
string line;
int? endLine;
int startLine;
Method method;
Search search;
int? scopeEnum;
string firstLine;
string innerLine;
string lineSegmentFirst;
List<int> referenceToLineNumbers;
Regex parameterRegex = CSharpParameter();
ReadOnlyDictionary<string, string> parameters;
System.Text.RegularExpressions.Match regularExpressionsMatch;
for (int i = 0; i < lines.Length; i++) {
check = GetNumberOfStartSpaces(lines, i);
if (check != scopeSpaces)
continue;
line = lines[i].Trim();
if (string.IsNullOrEmpty(line))
continue;
if (line.Length < 5)
continue;
if (line.EndsWith(','))
continue;
regularExpressionsMatch = CSharpMethodLine().Match(line);
if (!regularExpressionsMatch.Success)
continue;
match = new(Async: regularExpressionsMatch.Groups[_Async].Value,
Name: regularExpressionsMatch.Groups[_Name].Value,
Parameters: regularExpressionsMatch.Groups[_Parameters].Value,
Partial: regularExpressionsMatch.Groups[_Partial].Value,
Result: regularExpressionsMatch.Groups[_Result].Value,
Scope: regularExpressionsMatch.Groups[_Scope].Value,
Static: regularExpressionsMatch.Groups[_Static].Value,
Value: regularExpressionsMatch.Value);
scopeEnum = GetScopeEnum(match);
parameters = GetParameters(parameterRegex, match);
search = new(Constructor: $"{match.Name.ToLower()} = new(",
Delegate: $" += {match.Name};",
Name: $" {match.Name}(",
Not: $"!{match.Name}(",
Wrap: $"({match.Name}(");
logger.LogInformation("{line} {a} // {results}", line.Split(" =>")[0], "{ }", results.Count);
if (string.IsNullOrEmpty(match.Name))
continue;
blocks = 0;
startLine = GetStartLine(lines, i);
if (!lines[startLine].StartsWith("#pragma") && !lines[startLine].StartsWith("#nullable"))
firstLine = lines[startLine].Trim();
else
firstLine = lines[startLine + 1].Trim();
isLinq = !lines[i + 1].StartsWith("#pragma") && !lines[i + 1].StartsWith("#nullable") && lines[i].Trim()[^1] != '{' && lines[i + 1].Trim() != "{";
if (isLinq)
blocks++;
endLine = null;
if (lines[i].Trim()[^1] == '{')
blocks++;
for (int j = i + 1; j < lines.Length; j++) {
innerLine = lines[j].Trim();
if (innerLine.StartsWith("#pragma") || innerLine.StartsWith("#nullable"))
continue;
if (isLinq && string.IsNullOrEmpty(innerLine)) {
if (line.EndsWith(';'))
blocks--;
}
blocks += GetLineBlockCount(innerLine, isLinq);
if (blocks != 0)
continue;
endLine = j;
if (lines.Length > j + 1 && string.IsNullOrEmpty(lines[j + 1].Trim()))
endLine++;
if (j > lines.Length - 2)
throw new Exception();
break;
}
referenceToLineNumbers = GetReferenceToLineNumbers(lines: lines, start: 0, end: lines.Length, i: i, search: search, parameters: parameters);
if (referenceToLineNumbers.Count == 0) {
lineSegmentFirst = line.Split(match.Name)[0];
if (!lines[i - 1].Trim().StartsWith("[Obsolete")) {
if (lineSegmentFirst.StartsWith("private"))
logger.LogWarning("// <{cSharpFileName}> {name} with {parameters} parameter(s) <{line}>", Path.GetFileName(cSharpFile), match.Name, parameters, lineSegmentFirst);
else
logger.LogInformation("// <{cSharpFileName}> {name} with {parameters} parameter(s) <{line}>", Path.GetFileName(cSharpFile), match.Name, parameters, lineSegmentFirst);
}
}
if (referenceToLineNumbers.Count == 0)
referenceToLineNumbers.Add(-1);
logger.LogInformation("{line} {a} // {results} ~~~ {startLine} => {firstUsedLine}", line.Split(" =>")[0], "{ }", results.Count, startLine, referenceToLineNumbers.First());
method = new(EndLine: endLine,
FirstLine: firstLine,
I: i,
Line: line,
Match: match,
Parameters: parameters,
ReferenceToLineNumbers: referenceToLineNumbers.AsReadOnly(),
Search: search,
ScopeEnum: scopeEnum,
StartLine: startLine);
results.Add(method);
}
return results.AsReadOnly();
}
private static int GetNumberOfStartSpaces(string[] lines, int i) {
int result = 0;
foreach (char @char in lines[i]) {
if (@char != ' ')
break;
result += 1;
}
return result;
}
private static int GetScopeEnum(Match match) {
int result;
int value = match.Scope switch {
"public" => 8000,
"internal" => 7000,
"protected" => 6000,
"private" => 5000,
_ => match.Scope.Length > 2
&& match.Scope[..2] == " I"
&& match.Scope[^1] == '.' ? 9000 : throw new NotImplementedException()
};
result = value
+ (string.IsNullOrEmpty(match.Result) ? 100 : 0)
+ (string.IsNullOrEmpty(match.Static) ? 0 : 10)
+ (string.IsNullOrEmpty(match.Async) ? 0 : 1);
return result;
}
private static ReadOnlyDictionary<string, string> GetParameters(Regex parameterRegex, Match match) {
Dictionary<string, string> results = [];
string value;
string[] segments;
System.Text.RegularExpressions.Match[] matches = parameterRegex.Matches($"{match.Parameters},").ToArray();
try {
foreach (System.Text.RegularExpressions.Match m in matches) {
if (!m.Success)
continue;
value = m.Value.Trim()[..^1];
segments = value.Split(' ');
results.Add(segments[^1], value);
}
} catch (Exception) {
results.Clear();
System.Text.RegularExpressions.Match m;
for (int i = 0; i < matches.Length; i++) {
m = matches[i];
if (!m.Success)
continue;
results.Add(i.ToString(), i.ToString());
}
}
return new(results);
}
private static int GetStartLine(string[] lines, int i) {
int result = i;
string line;
for (int j = i - 1; j > -1; j--) {
line = lines[j].Trim();
if (!line.StartsWith('[') && !line.StartsWith('#') && !line.StartsWith("/// "))
break;
result--;
}
return result;
}
private static int GetLineBlockCount(string line, bool isLinq) {
int result = 0;
bool ignore = false;
for (int i = 0; i < line.Length; i++) {
if (line[i] == '\'')
i++;
else if (!isLinq && !ignore && line[i] == '{')
result++;
else if (!isLinq && !ignore && line[i] == '}')
result--;
else if (isLinq && !ignore && line[i] == ';')
result--;
else if (i > 0 && line[i] == '"' && line[i - 1] != '\\')
ignore = !ignore;
}
return result;
}
private static List<int> GetReferenceToLineNumbers(string[] lines, int start, int end, int i, Search search, ReadOnlyDictionary<string, string> parameters) {
List<int> results = [];
string[] segments;
string[] afterSegments;
string lastSegmentBeforeDot;
for (int j = start; j < end; j++) {
if (j == i)
continue;
segments = lines[j].Split(search.Name);
if (segments.Length == 1) {
segments = lines[j].Split(search.Not);
if (segments.Length == 1) {
segments = lines[j].Split(search.Wrap);
if (segments.Length == 1) {
if (!lines[j].EndsWith(search.Delegate)) {
segments = lines[j].Split(search.Constructor);
if (segments.Length == 1)
continue;
}
}
}
}
if (lines[j].EndsWith(search.Delegate))
results.Add(j);
else {
lastSegmentBeforeDot = segments[^1].Split(").")[0];
if (parameters.Count == 0) {
if (lastSegmentBeforeDot.Contains(','))
continue;
} else {
afterSegments = lastSegmentBeforeDot.Split(',');
if (afterSegments.Length != parameters.Count)
continue;
}
results.Add(j);
}
}
return results;
}
private static ReadOnlyCollection<MethodWith> GetCollection(ILogger<Worker> logger, string[] lines, ReadOnlyCollection<Method> sortedMethods) {
List<MethodWith> results = [];
List<Method> check = sortedMethods.ToList();
foreach (Method method in sortedMethods) {
logger.LogInformation($"{method.Match.Name} => {method.Parameters.Count}");
if (method.EndLine is null)
continue;
if (!check.Remove(method))
continue;
MethodWith methodWith = GetMethodWith(lines, sortedMethods, check, method, method.EndLine.Value);
results.Add(methodWith);
}
return results.AsReadOnly();
}
private static MethodWith GetMethodWith(string[] lines, ReadOnlyCollection<Method> methods, List<Method> check, Method method, int methodEndLineValue) {
MethodWith methodWith;
List<int> referenceToLineNumbers;
MethodWith[] sortedReferences;
Dictionary<int, MethodWith> references = [];
foreach (Method m in methods) {
if (m.EndLine is null)
continue;
if (m == method)
continue;
referenceToLineNumbers = GetReferenceToLineNumbers(lines: lines, start: method.StartLine, end: methodEndLineValue, i: -1, search: m.Search, parameters: m.Parameters);
if (referenceToLineNumbers.Count > 0) {
if (!check.Remove(m))
continue;
foreach (int i in referenceToLineNumbers) {
if (references.ContainsKey(i))
continue;
methodWith = GetMethodWith(lines, methods, check, m, m.EndLine.Value);
references.Add(i, methodWith);
break;
}
}
}
if (references.Count < 2)
sortedReferences = (from l in references select l.Value).ToArray();
else
sortedReferences = (from l in references orderby l.Key select l.Value).ToArray();
methodWith = new(EndLine: method.EndLine,
FirstLine: method.FirstLine,
Line: method.Line,
Match: method.Match,
Parameters: method.Parameters,
References: new(sortedReferences),
ReferenceToLineNumbers: method.ReferenceToLineNumbers,
ScopeEnum: method.ScopeEnum,
Search: method.Search,
StartLine: method.StartLine);
return methodWith;
}
private static bool WriteAllLines(string cSharpFile, string[] lines, ReadOnlyCollection<MethodWith> collection) {
bool result;
if (Debugger.IsAttached)
WriteDebug(collection);
List<string> results = [];
ReadOnlyCollection<int> methodLines = GetMethodLines(collection);
int maxMethodLines = methodLines.Max();
for (int i = 0; i < maxMethodLines; i++) {
if (methodLines.Contains(i))
continue;
results.Add(lines[i]);
}
List<bool> nests = [true];
foreach (MethodWith methodWith in collection) {
if (methodWith.EndLine is null)
continue;
AppendLines(results, nests, lines, methodWith, methodWith.EndLine.Value);
}
for (int i = maxMethodLines + 1; i < lines.Length; i++)
results.Add(lines[i]);
string text = File.ReadAllText(cSharpFile);
string join = string.Join(Environment.NewLine, results);
if (join == text)
result = false;
else {
result = true;
File.WriteAllText(cSharpFile, join);
}
return result;
}
private static void WriteDebug(ReadOnlyCollection<MethodWith> collection) {
List<string> results = [];
List<bool> nests = [true];
foreach (MethodWith methodWith in collection)
AppendLines(results, nests, methodWith);
File.WriteAllText(Path.Combine(".vscode", "helper", ".md"), string.Join(Environment.NewLine, results));
}
private static void AppendLines(List<string> results, List<bool> nests, MethodWith methodWith) {
nests.Add(true);
results.Add($" - {new string('#', nests.Count)} {methodWith.Match.Name} => {methodWith.Parameters.Count}");
foreach (MethodWith m in methodWith.References)
AppendLines(results, nests, m);
nests.RemoveAt(nests.Count - 1);
}
private static ReadOnlyCollection<int> GetMethodLines(ReadOnlyCollection<MethodWith> collection) {
List<int> results = [];
List<bool> nests = [true];
foreach (MethodWith methodWith in collection) {
if (methodWith.EndLine is null)
continue;
AppendLineNumbers(results, nests, methodWith, methodWith.EndLine.Value);
}
int[] distinct = results.Distinct().ToArray();
if (distinct.Length != results.Count)
throw new Exception();
return new(results);
}
private static void AppendLineNumbers(List<int> results, List<bool> nests, MethodWith methodWith, int methodWithEndLineValue) {
nests.Add(true);
for (int i = methodWith.StartLine; i < methodWithEndLineValue + 1; i++)
results.Add(i);
foreach (MethodWith m in methodWith.References) {
if (m.EndLine is null)
continue;
AppendLineNumbers(results, nests, m, m.EndLine.Value);
}
nests.RemoveAt(nests.Count - 1);
}
private static void AppendLines(List<string> results, List<bool> nests, string[] lines, MethodWith methodWith, int methodWithEndLineValue) {
nests.Add(true);
for (int i = methodWith.StartLine; i < methodWithEndLineValue + 1; i++)
results.Add(lines[i]);
foreach (MethodWith m in methodWith.References) {
if (m.EndLine is null)
continue;
AppendLines(results, nests, lines, m, m.EndLine.Value);
}
nests.RemoveAt(nests.Count - 1);
}
}
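GetSortedMethods above orders methods by scope rank descending, then by reference count descending, then by line length, name length, and name; a small standalone illustration of that ordering on plain tuples (SortOrderExample and the sample signatures are illustrative only):
internal static class SortOrderExample {
internal static void Main() {
// (ScopeEnum, References, Line, Name) stand in for the Method fields used by GetSortedMethods.
(int ScopeEnum, int References, string Line, string Name)[] methods = [
(5000, 3, "private static void BBB()", "BBB"),
(8100, 0, "public void A()", "A"),
(5000, 3, "private static void CC()", "CC"),
];
string[] sorted = (from m in methods orderby m.ScopeEnum descending, m.References descending, m.Line.Length, m.Name.Length, m.Name select m.Name).ToArray();
Console.WriteLine(string.Join(", ", sorted)); // A, CC, BBB
}
}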


@ -0,0 +1,138 @@
using System.Collections.ObjectModel;
using File_Folder_Helper.Helpers;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250321 {
private record Record(string Directory,
string File,
ThreeDeep ThreeDeep) {
public static ReadOnlyCollection<Record> GetCollection(string sourceDirectory, string searchPattern, string[] files) {
List<Record> results = [];
Record record;
string directory;
string fileNameWithoutExtension;
bool json = searchPattern.Contains(".json");
bool check = searchPattern.Split('.').Length == 3;
ReadOnlyCollection<ThreeDeep> collection = ThreeDeep.GetCollection(files);
foreach (ThreeDeep threeDeep in collection) {
if (!json && check)
fileNameWithoutExtension = threeDeep.DirectoryName;
else if (!json && !check)
fileNameWithoutExtension = threeDeep.FileNameWithoutExtension;
else if (json)
fileNameWithoutExtension = Path.GetFileNameWithoutExtension(threeDeep.FileNameWithoutExtension);
else
throw new NotImplementedException();
directory = $"{fileNameWithoutExtension[^1]}{fileNameWithoutExtension[^3..][..2]}";
if (json || (!json && !check)) {
record = new(Directory: Path.Combine(sourceDirectory, "new-a", directory),
File: $"{threeDeep.FileNameWithoutExtension}{threeDeep.Extension}",
ThreeDeep: threeDeep);
} else if (!json && check) {
record = new(Directory: Path.Combine(sourceDirectory, "new-b", directory, threeDeep.DirectoryName),
File: $"{threeDeep.FileNameWithoutExtension}{threeDeep.Extension}",
ThreeDeep: threeDeep);
} else
throw new NotImplementedException();
results.Add(record);
}
return results.AsReadOnly();
}
}
private record ThreeDeep(string Extension,
string FileNameWithoutExtension,
long LastModified,
long Length,
string DirectoryName,
string ParentDirectoryName,
string Root) {
public static ReadOnlyCollection<ThreeDeep> GetCollection(string[] files) {
List<ThreeDeep> results = [];
ThreeDeep record;
FileInfo fileInfo;
string parentDirectory;
foreach (string file in files) {
fileInfo = new(file);
parentDirectory = Path.GetDirectoryName(fileInfo.DirectoryName) ?? throw new Exception();
record = new(Extension: Path.GetExtension(file),
FileNameWithoutExtension: Path.GetFileNameWithoutExtension(file),
LastModified: fileInfo.LastWriteTime.Ticks,
Length: fileInfo.Length,
DirectoryName: Path.GetFileName(fileInfo.DirectoryName) ?? throw new Exception(),
ParentDirectoryName: Path.GetFileName(parentDirectory),
Root: Path.GetDirectoryName(parentDirectory) ?? throw new Exception());
results.Add(record);
}
return results.AsReadOnly();
}
public static string GetFullPath(ThreeDeep threeDeep) =>
Path.Combine(threeDeep.Root, threeDeep.ParentDirectoryName, threeDeep.DirectoryName, $"{threeDeep.FileNameWithoutExtension}{threeDeep.Extension}");
}
internal static void MoveToLast(ILogger<Worker> logger, List<string> args) {
string[] searchPatterns = args[2].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
if (searchPatterns.Length == 1)
logger.LogInformation("No code for just one!");
else {
HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, sourceDirectory);
ReadOnlyCollection<Record> collection = GetCollection(logger, searchPatterns, sourceDirectory);
if (collection.Count != 0)
UseCollection(collection);
else
logger.LogInformation("No files!");
if (collection.Count != 0)
HelperDeleteEmptyDirectories.DeleteEmptyDirectories(logger, sourceDirectory);
}
}
private static ReadOnlyCollection<Record> GetCollection(ILogger<Worker> logger, string[] searchPatterns, string sourceDirectory) {
string[] files;
List<Record> results = [];
foreach (string searchPattern in searchPatterns) {
files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogWarning("<{files}>(s)", files.Length);
else {
ReadOnlyCollection<Record> collection = Record.GetCollection(sourceDirectory, searchPattern, files);
results.AddRange(collection);
}
}
return results.AsReadOnly();
}
private static void UseCollection(ReadOnlyCollection<Record> collection) {
string fullPath;
string checkFile;
List<string> distinct = [];
foreach (Record record in collection) {
if (distinct.Contains(record.Directory))
continue;
distinct.Add(record.Directory);
}
foreach (string directory in distinct) {
if (Directory.Exists(directory))
continue;
_ = Directory.CreateDirectory(directory);
}
foreach (Record record in collection) {
checkFile = Path.Combine(record.Directory, record.File);
if (File.Exists(checkFile))
continue;
fullPath = ThreeDeep.GetFullPath(record.ThreeDeep);
File.Move(fullPath, checkFile);
}
}
}
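The bucket directory in Record.GetCollection is the last character of the name followed by the first two of its last three characters; a tiny standalone check of that index expression (the sample name is illustrative only):
internal static class BucketNameExample {
internal static void Main() {
string fileNameWithoutExtension = "ABC1234567";
// Last character, then the first two of the last three characters: "7" + "56".
string directory = $"{fileNameWithoutExtension[^1]}{fileNameWithoutExtension[^3..][..2]}";
Console.WriteLine(directory); // 756
}
}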


@ -0,0 +1,236 @@
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250404 {
internal record KafkaProducerSaslOptions(
[property: JsonPropertyName("mechanism")] string Mechanism
);
internal record MonitorList(
[property: JsonPropertyName("id")] int Id,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("description")] string Description,
[property: JsonPropertyName("pathName")] string PathName,
[property: JsonPropertyName("parent")] int? Parent,
[property: JsonPropertyName("childrenIDs")] IReadOnlyList<int> ChildrenIDs,
[property: JsonPropertyName("url")] string Url,
[property: JsonPropertyName("method")] string Method,
[property: JsonPropertyName("hostname")] object Hostname,
[property: JsonPropertyName("port")] object Port,
[property: JsonPropertyName("maxretries")] int MaxRetries,
[property: JsonPropertyName("weight")] int Weight,
[property: JsonPropertyName("active")] bool Active,
[property: JsonPropertyName("forceInactive")] bool ForceInactive,
[property: JsonPropertyName("type")] string Type,
[property: JsonPropertyName("timeout")] int Timeout,
[property: JsonPropertyName("interval")] int Interval,
[property: JsonPropertyName("retryInterval")] int RetryInterval,
[property: JsonPropertyName("resendInterval")] int ResendInterval,
[property: JsonPropertyName("keyword")] object Keyword,
[property: JsonPropertyName("invertKeyword")] bool InvertKeyword,
[property: JsonPropertyName("expiryNotification")] bool ExpiryNotification,
[property: JsonPropertyName("ignoreTls")] bool IgnoreTls,
[property: JsonPropertyName("upsideDown")] bool UpsideDown,
[property: JsonPropertyName("packetSize")] int PacketSize,
[property: JsonPropertyName("maxredirects")] int MaxRedirects,
[property: JsonPropertyName("accepted_statuscodes")] IReadOnlyList<string> AcceptedStatusCodes,
[property: JsonPropertyName("dns_resolve_type")] string DnsResolveType,
[property: JsonPropertyName("dns_resolve_server")] string DnsResolveServer,
[property: JsonPropertyName("dns_last_result")] object DnsLastResult,
[property: JsonPropertyName("docker_container")] string DockerContainer,
[property: JsonPropertyName("docker_host")] object DockerHost,
[property: JsonPropertyName("proxyId")] object ProxyId,
[property: JsonPropertyName("notificationIDList")] NotificationIDList NotificationIDList,
[property: JsonPropertyName("tags")] IReadOnlyList<object> Tags,
[property: JsonPropertyName("maintenance")] bool Maintenance,
[property: JsonPropertyName("mqttTopic")] string MqttTopic,
[property: JsonPropertyName("mqttSuccessMessage")] string MqttSuccessMessage,
[property: JsonPropertyName("databaseQuery")] object DatabaseQuery,
[property: JsonPropertyName("authMethod")] string AuthMethod,
[property: JsonPropertyName("grpcUrl")] object GrpcUrl,
[property: JsonPropertyName("grpcProtobuf")] object GrpcProtobuf,
[property: JsonPropertyName("grpcMethod")] object GrpcMethod,
[property: JsonPropertyName("grpcServiceName")] object GrpcServiceName,
[property: JsonPropertyName("grpcEnableTls")] bool GrpcEnableTls,
[property: JsonPropertyName("radiusCalledStationId")] object RadiusCalledStationId,
[property: JsonPropertyName("radiusCallingStationId")] object RadiusCallingStationId,
[property: JsonPropertyName("game")] object Game,
[property: JsonPropertyName("gamedigGivenPortOnly")] bool GameDigGivenPortOnly,
[property: JsonPropertyName("httpBodyEncoding")] string HttpBodyEncoding,
[property: JsonPropertyName("jsonPath")] object JsonPath,
[property: JsonPropertyName("expectedValue")] object ExpectedValue,
[property: JsonPropertyName("kafkaProducerTopic")] object KafkaProducerTopic,
[property: JsonPropertyName("kafkaProducerBrokers")] IReadOnlyList<object> KafkaProducerBrokers,
[property: JsonPropertyName("kafkaProducerSsl")] bool KafkaProducerSsl,
[property: JsonPropertyName("kafkaProducerAllowAutoTopicCreation")] bool KafkaProducerAllowAutoTopicCreation,
[property: JsonPropertyName("kafkaProducerMessage")] object KafkaProducerMessage,
[property: JsonPropertyName("screenshot")] object Screenshot,
[property: JsonPropertyName("headers")] object Headers,
[property: JsonPropertyName("body")] object Body,
[property: JsonPropertyName("grpcBody")] object GrpcBody,
[property: JsonPropertyName("grpcMetadata")] object GrpcMetadata,
[property: JsonPropertyName("basic_auth_user")] string BasicAuthUser,
[property: JsonPropertyName("basic_auth_pass")] string BasicAuthPass,
[property: JsonPropertyName("oauth_client_id")] object OauthClientId,
[property: JsonPropertyName("oauth_client_secret")] object OauthClientSecret,
[property: JsonPropertyName("oauth_token_url")] object OauthTokenUrl,
[property: JsonPropertyName("oauth_scopes")] object OauthScopes,
[property: JsonPropertyName("oauth_auth_method")] string OauthAuthMethod,
[property: JsonPropertyName("pushToken")] string PushToken,
[property: JsonPropertyName("databaseConnectionString")] string DatabaseConnectionString,
[property: JsonPropertyName("radiusUsername")] object RadiusUsername,
[property: JsonPropertyName("radiusPassword")] object RadiusPassword,
[property: JsonPropertyName("radiusSecret")] object RadiusSecret,
[property: JsonPropertyName("mqttUsername")] string MqttUsername,
[property: JsonPropertyName("mqttPassword")] string MqttPassword,
[property: JsonPropertyName("authWorkstation")] object AuthWorkstation,
[property: JsonPropertyName("authDomain")] object AuthDomain,
[property: JsonPropertyName("tlsCa")] object TlsCa,
[property: JsonPropertyName("tlsCert")] object TlsCert,
[property: JsonPropertyName("tlsKey")] object TlsKey,
[property: JsonPropertyName("kafkaProducerSaslOptions")] KafkaProducerSaslOptions KafkaProducerSaslOptions,
[property: JsonPropertyName("includeSensitiveData")] bool IncludeSensitiveData
);
internal record NotificationIDList(
[property: JsonPropertyName("4")] bool _4
);
internal record NotificationList(
[property: JsonPropertyName("id")] int Id,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("active")] bool Active,
[property: JsonPropertyName("userId")] int UserId,
[property: JsonPropertyName("isDefault")] bool IsDefault,
[property: JsonPropertyName("config")] string Config
);
internal record Kuma(
[property: JsonPropertyName("version")] string Version,
[property: JsonPropertyName("notificationList")] IReadOnlyList<NotificationList> NotificationList,
[property: JsonPropertyName("monitorList")] IReadOnlyList<MonitorList> MonitorList
);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Kuma))]
private partial class KumaCommonSourceGenerationContext : JsonSerializerContext {
}
internal static void KumaToGatus(ILogger<Worker> logger, List<string> args) {
string url = args[4];
string fileName = args[3];
string searchPattern = args[2];
ParseMetrics(logger, fileName, url);
string sourceDirectory = Path.GetFullPath(args[0]);
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length == 0)
logger.LogWarning("<{files}>(s)", files.Length);
else
KumaToGatus(files);
}
private static void ParseMetrics(ILogger<Worker> logger, string fileName, string url) {
FileStream fileStream = new(fileName, FileMode.Create); // Create (rather than Truncate) so a missing metrics file is created instead of throwing.
HttpClient httpClient = new();
Task<Stream> streamTask = httpClient.GetStreamAsync(url);
streamTask.Wait();
Task task = streamTask.Result.CopyToAsync(fileStream);
task.Wait();
ParseMetrics(logger, fileStream);
fileStream.Dispose();
streamTask.Dispose();
httpClient.Dispose();
}
private static void ParseMetrics(ILogger<Worker> _, FileStream __) {
// Task<List<IMetric>> metrics = PrometheusMetricsParser.ParseAsync(fileStream);
// metrics.Wait();
// foreach (IMetric metric in metrics.Result) {
// if (metric is not Gauge gauge)
// continue;
// foreach (GaugeMeasurement gaugeMeasurement in gauge.Measurements) {
// if (string.IsNullOrEmpty(metric.Name))
// continue;
// foreach (KeyValuePair<string, string> keyValuePair in gaugeMeasurement.Labels) {
// logger.LogInformation("name:{name}; timestamp:{timestamp}; value:{value}; key-name:{key-name}; key-value:{key-value}",
// metric.Name,
// gaugeMeasurement.Timestamp,
// gaugeMeasurement.Value,
// keyValuePair.Key,
// keyValuePair.Value);
// }
// }
// }
}
private static void KumaToGatus(string[] files) {
Kuma? kuma;
string json;
string checkFile;
foreach (string file in files) {
checkFile = file.ToLower().Replace('_', '-');
if (checkFile != file)
File.Move(file, checkFile);
json = File.ReadAllText(checkFile);
kuma = JsonSerializer.Deserialize(json, KumaCommonSourceGenerationContext.Default.Kuma);
if (kuma is null)
continue;
WriteGatus(checkFile, kuma);
}
}
private static void WriteGatus(string file, Kuma kuma) {
List<string> results = [
string.Empty,
$"# set GATUS_CONFIG_PATH=./{Path.GetFileName(file)}.yaml",
string.Empty,
"endpoints:"
];
string[] segments;
foreach (MonitorList monitorList in kuma.MonitorList) {
if (monitorList.Type is not "http" and not "postgres")
continue;
results.Add($" - name: {monitorList.Name}");
results.Add($" group: {monitorList.PathName.Split(' ')[0]}");
results.Add($" enabled: {monitorList.Active.ToString().ToLower()}");
results.Add($" interval: {monitorList.Interval}s");
if (monitorList.Type == "http") {
results.Add($" method: {monitorList.Method}");
results.Add($" url: \"{monitorList.Url}\"");
if (monitorList.AuthMethod == "basic") {
results.Add($" # user: \"{monitorList.BasicAuthUser}\"");
results.Add($" # password: \"{monitorList.BasicAuthPass}\"");
}
results.Add(" conditions:");
results.Add(" - \"[STATUS] < 300\"");
if (monitorList.Url.Contains("https"))
results.Add(" - \"[CERTIFICATE_EXPIRATION] > 48h\"");
results.Add($" - \"[RESPONSE_TIME] < {monitorList.Timeout}\"");
} else if (monitorList.Type == "postgres") {
segments = monitorList.DatabaseConnectionString.Split('@');
if (segments.Length != 2)
continue;
results.Add($" # connectionString: \"{monitorList.DatabaseConnectionString}\"");
results.Add($" url: \"tcp://{segments[1].Split('/')[0]}\"");
results.Add(" conditions:");
results.Add(" - \"[CONNECTED] == true\"");
} else
throw new NotImplementedException();
results.Add(" alerts:");
results.Add(" - type: email");
results.Add(" description: \"healthcheck failed\"");
results.Add(" send-on-resolved: true");
results.Add(" - type: gotify");
results.Add(" description: \"healthcheck failed\"");
results.Add(" send-on-resolved: true");
results.Add(string.Empty);
}
File.WriteAllText($"{file}.yaml", string.Join(Environment.NewLine, results));
}
}
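For postgres monitors, WriteGatus keeps only the host and port of the connection string for the Gatus tcp endpoint; a quick standalone check of that split (the connection string and credentials are placeholders):
internal static class PostgresUrlExample {
internal static void Main() {
string databaseConnectionString = "postgres://user:secret@db-host:5432/postgres";
string[] segments = databaseConnectionString.Split('@');
// segments[1] is "db-host:5432/postgres"; everything before the first '/' becomes the tcp:// url.
string url = $"tcp://{segments[1].Split('/')[0]}";
Console.WriteLine(url); // tcp://db-host:5432
}
}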


@ -0,0 +1,693 @@
using System.Collections.ObjectModel;
using System.Text.Json;
using System.Text.Json.Serialization;
using File_Folder_Helper.Models;
using Microsoft.Extensions.FileSystemGlobbing;
using Microsoft.Extensions.Logging;
#if ShellProgressBar
using ShellProgressBar;
#endif
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250407 {
private record Record(string RelativePath,
long Size,
long Ticks);
private record Download(string Directory,
string Display,
string File,
long Size,
long Ticks,
string UniformResourceLocator);
private record Segment(Record? Left,
string? LeftDirectory,
Record? Right,
string RightDirectory,
string RootUniformResourceLocator);
private record Logic(char GreaterThan,
bool? LeftSideIsNewer,
int LeftSideIsNewerIndex,
bool? LeftSideOnly,
int LeftSideOnlyIndex,
char LessThan,
char Minus,
bool? NotEqualBut,
int NotEqualButIndex,
char Plus,
string[] Raw,
bool? RightSideIsNewer,
int RightSideIsNewerIndex,
bool? RightSideOnly,
int RightSideOnlyIndex) {
internal static Logic? Get(string[] segments) {
Logic? result;
bool check = true;
bool? notEqualBut;
bool? leftSideOnly;
bool? rightSideOnly;
bool? leftSideIsNewer;
const char plus = '+';
bool? rightSideIsNewer;
const char minus = '-';
const char lessThan = 'L';
const char greaterThan = 'G';
const int notEqualButIndex = 2;
const int leftSideOnlyIndex = 0;
const int rightSideOnlyIndex = 4;
const int leftSideIsNewerIndex = 1;
const int rightSideIsNewerIndex = 3;
if (string.IsNullOrEmpty(segments[leftSideOnlyIndex]))
leftSideOnly = null;
else if (segments[leftSideOnlyIndex][0] == plus)
leftSideOnly = true;
else if (segments[leftSideOnlyIndex][0] == minus)
leftSideOnly = false;
else {
check = false;
leftSideOnly = null;
}
if (string.IsNullOrEmpty(segments[leftSideIsNewerIndex]))
leftSideIsNewer = null;
else if (segments[leftSideIsNewerIndex][0] == greaterThan)
leftSideIsNewer = true;
else if (segments[leftSideIsNewerIndex][0] == lessThan)
leftSideIsNewer = false;
else {
check = false;
leftSideIsNewer = null;
}
if (string.IsNullOrEmpty(segments[notEqualButIndex]))
notEqualBut = null;
else if (segments[notEqualButIndex][0] == greaterThan)
notEqualBut = true;
else if (segments[notEqualButIndex][0] == lessThan)
notEqualBut = false;
else {
check = false;
notEqualBut = null;
}
if (string.IsNullOrEmpty(segments[rightSideIsNewerIndex]))
rightSideIsNewer = null;
else if (segments[rightSideIsNewerIndex][0] == greaterThan)
rightSideIsNewer = true;
else if (segments[rightSideIsNewerIndex][0] == lessThan)
rightSideIsNewer = false;
else {
check = false;
rightSideIsNewer = null;
}
if (string.IsNullOrEmpty(segments[rightSideOnlyIndex]))
rightSideOnly = null;
else if (segments[rightSideOnlyIndex][0] == plus)
rightSideOnly = true;
else if (segments[rightSideOnlyIndex][0] == minus)
rightSideOnly = false;
else {
check = false;
rightSideOnly = null;
}
result = !check ? null : new(GreaterThan: greaterThan,
LeftSideIsNewerIndex: leftSideIsNewerIndex,
LeftSideIsNewer: leftSideIsNewer,
LeftSideOnly: leftSideOnly,
LeftSideOnlyIndex: leftSideOnlyIndex,
LessThan: lessThan,
Minus: minus,
NotEqualBut: notEqualBut,
NotEqualButIndex: notEqualButIndex,
Plus: plus,
RightSideIsNewer: rightSideIsNewer,
RightSideIsNewerIndex: rightSideIsNewerIndex,
RightSideOnly: rightSideOnly,
Raw: segments,
RightSideOnlyIndex: rightSideOnlyIndex);
return result;
}
}
private record Review(Segment[]? AreEqual,
Segment[]? LeftSideIsNewer,
Segment[]? LeftSideOnly,
Segment[]? NotEqualBut,
Record[]? Records,
Segment[]? RightSideIsNewer,
Segment[]? RightSideOnly);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Review))]
private partial class ReviewCommonSourceGenerationContext : JsonSerializerContext {
}
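// args[5] is a '~'-separated, five-part action string parsed by Logic.Get:
//   index 0 (left side only)    expects '+' (true) or '-' (false)
//   index 1 (left side newer)   expects 'G' (true) or 'L' (false)
//   index 2 (not equal but)     expects 'G' (true) or 'L' (false)
//   index 3 (right side newer)  expects 'G' (true) or 'L' (false)
//   index 4 (right side only)   expects '+' (true) or '-' (false)
// An empty segment leaves that flag null; any other character makes Logic.Get return null, which disables all actions below.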
internal static void Sync(ILogger<Worker> logger, List<string> args) {
Matcher matcher = new();
string fileName = $"{args[1]}.json";
string[] segments = args[5].Split('~');
string rightDirectory = Path.GetFullPath(args[0].Split('~')[0]);
Logic? logic = segments.Length != 5 ? null : Logic.Get(segments);
string includePatternsFile = Path.Combine(rightDirectory, args[2]);
string excludePatternsFile = Path.Combine(rightDirectory, args[3]);
string[] rootUniformResourceLocators = args.Count < 5 ? [] : args[4].Split('~');
matcher.AddIncludePatterns(!File.Exists(includePatternsFile) ? ["*"] : File.ReadAllLines(includePatternsFile));
matcher.AddExcludePatterns(!File.Exists(excludePatternsFile) ? ["System Volume Information"] : File.ReadAllLines(excludePatternsFile));
ReadOnlyCollection<Record> rightRecords = GetRecords(rightDirectory, matcher);
if (rightRecords.Count == 0)
logger.LogInformation("No source records");
else {
string checkFile = Path.Combine(rightDirectory, fileName);
Review review = new(AreEqual: null,
LeftSideIsNewer: null,
LeftSideOnly: null,
NotEqualBut: null,
Records: rightRecords.ToArray(),
RightSideIsNewer: null,
RightSideOnly: null);
string json = JsonSerializer.Serialize(review, ReviewCommonSourceGenerationContext.Default.Review);
WriteAllText(checkFile, json);
if (rootUniformResourceLocators.Length == 0)
logger.LogInformation("No urls");
else {
string format = NginxFileSystem.GetFormat();
TimeZoneInfo timeZoneInfo = TimeZoneInfo.Local;
Sync(logger, rightDirectory, fileName, logic, rootUniformResourceLocators, rightRecords, format, timeZoneInfo);
}
}
}
private static ReadOnlyCollection<Record> GetRecords(string rightDirectory, Matcher matcher) {
List<Record> results = [
new(RelativePath: rightDirectory,
Size: 0,
Ticks: 0)];
Record record;
FileInfo fileInfo;
string relativePath;
ReadOnlyCollection<ReadOnlyCollection<string>> collection = Helpers.HelperDirectory.GetFilesCollection(rightDirectory, "*", "*");
foreach (ReadOnlyCollection<string> c in collection) {
foreach (string f in c) {
if (!matcher.Match(rightDirectory, f).HasMatches)
continue;
fileInfo = new(f);
if (fileInfo.Length == 0)
continue;
relativePath = Path.GetRelativePath(rightDirectory, fileInfo.FullName);
record = new(RelativePath: relativePath,
Size: fileInfo.Length,
Ticks: fileInfo.LastWriteTime.ToUniversalTime().Ticks);
results.Add(record);
}
}
return results.AsReadOnly();
}
private static void WriteAllText(string path, string text) {
string check = !File.Exists(path) ? string.Empty : File.ReadAllText(path);
if (check != text)
File.WriteAllText(path, text);
}
private static void Sync(ILogger<Worker> logger, string rightDirectory, string fileName, Logic? logic, string[] rootUniformResourceLocators, ReadOnlyCollection<Record> rightRecords, string format, TimeZoneInfo timeZoneInfo) {
Review? review;
foreach (string rootUniformResourceLocator in rootUniformResourceLocators) {
if (!rootUniformResourceLocator.StartsWith("https:"))
logger.LogInformation("Not supported URL <{url}>", rootUniformResourceLocator);
else {
review = GetJsonResponse(logger, fileName, rootUniformResourceLocator, format, timeZoneInfo);
if (review?.Records is null || review.Records.Length == 0)
logger.LogInformation("No response records");
else {
ReadOnlyCollection<Record> leftRecords = review.Records.AsReadOnly();
Sync(logger, rightDirectory, fileName, logic, rightRecords, rootUniformResourceLocator, leftRecords);
}
}
}
}
private static Review? GetJsonResponse(ILogger<Worker> logger, string fileName, string rootUniformResourceLocator, string format, TimeZoneInfo timeZoneInfo) {
Review? result;
Task<string> response;
HttpClient httpClient = new();
Task<HttpResponseMessage> httpResponseMessage;
string url = rootUniformResourceLocator.EndsWith('/') ?
$"{rootUniformResourceLocator[..^1]}/{fileName}" :
$"{rootUniformResourceLocator}/{fileName}";
httpResponseMessage = httpClient.GetAsync(rootUniformResourceLocator);
httpResponseMessage.Wait();
if (!httpResponseMessage.Result.IsSuccessStatusCode) {
logger.LogInformation("Failed to download: <{rootUniformResourceLocator}>;", rootUniformResourceLocator);
result = null;
} else {
response = httpResponseMessage.Result.Content.ReadAsStringAsync();
response.Wait();
NginxFileSystem[]? nginxFileSystems = JsonSerializer.Deserialize(response.Result, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
bool isNewest = nginxFileSystems is not null && IsNewest(fileName, format, timeZoneInfo, new(rootUniformResourceLocator), nginxFileSystems);
if (nginxFileSystems is null) {
logger.LogInformation("Failed to parse: <{rootUniformResourceLocator}>;", rootUniformResourceLocator);
result = null;
} else if (!isNewest) {
logger.LogInformation("Outdated remote file: <{rootUniformResourceLocator}>;", rootUniformResourceLocator);
result = null;
} else {
httpResponseMessage = httpClient.GetAsync(url);
httpResponseMessage.Wait();
if (!httpResponseMessage.Result.IsSuccessStatusCode) {
logger.LogInformation("Failed to download: <{url}>;", url);
result = null;
} else {
response = httpResponseMessage.Result.Content.ReadAsStringAsync();
response.Wait();
result = string.IsNullOrEmpty(response.Result) ?
null :
JsonSerializer.Deserialize(response.Result, ReviewCommonSourceGenerationContext.Default.Review);
}
}
}
return result;
}
private static bool IsNewest(string fileName, string format, TimeZoneInfo timeZoneInfo, Uri uri, NginxFileSystem[] nginxFileSystems) {
bool result;
DateTime? match = null;
NginxFileSystem nginxFileSystem;
DateTime dateTime = DateTime.MinValue;
for (int i = 0; i < nginxFileSystems.Length; i++) {
nginxFileSystem = NginxFileSystem.Get(format, timeZoneInfo, uri, nginxFileSystems[i]);
if (nginxFileSystem.LastModified is not null && nginxFileSystem.Name == fileName) {
match = nginxFileSystem.LastModified.Value;
continue;
}
if (nginxFileSystem.LastModified is null || nginxFileSystem.LastModified <= dateTime)
continue;
dateTime = nginxFileSystem.LastModified.Value;
}
result = match is not null && match.Value > dateTime;
return result;
}
private static void Sync(ILogger<Worker> logger, string rightDirectory, string fileName, Logic? l, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
string json;
string checkFile;
HttpClient httpClient = new();
checkFile = Path.Combine(rightDirectory, fileName);
if (File.Exists(checkFile))
File.Delete(checkFile);
ReadOnlyCollection<Segment> areEqual = GetAreEqual(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
ReadOnlyCollection<Segment> notEqualBut = GetNotEqualBut(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
ReadOnlyCollection<Segment> leftSideOnly = GetLeftSideOnly(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
ReadOnlyCollection<Segment> rightSideOnly = GetRightSideOnly(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
ReadOnlyCollection<Segment> leftSideIsNewer = GetLeftSideIsNewer(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
ReadOnlyCollection<Segment> rightSideIsNewer = GetRightSideIsNewer(rightDirectory, fileName, rightRecords, rootUniformResourceLocators, leftRecords);
Review review = new(AreEqual: areEqual.ToArray(),
LeftSideIsNewer: leftSideIsNewer.ToArray(),
LeftSideOnly: leftSideOnly.ToArray(),
NotEqualBut: notEqualBut.ToArray(),
Records: null,
RightSideIsNewer: rightSideIsNewer.ToArray(),
RightSideOnly: rightSideOnly.ToArray());
json = JsonSerializer.Serialize(review, ReviewCommonSourceGenerationContext.Default.Review);
checkFile = Path.Combine(rightDirectory, fileName);
WriteAllText(checkFile, json);
if (notEqualBut.Count > 0 && l is not null && l.NotEqualBut is not null && l.Raw[l.NotEqualButIndex][0] == l.Minus && !l.NotEqualBut.Value)
logger.LogDebug("Doing nothing with {name}", nameof(Logic.NotEqualBut));
if (leftSideOnly.Count > 0 && l is not null && l.LeftSideOnly is not null && l.Raw[l.LeftSideOnlyIndex][0] == l.Minus && !l.LeftSideOnly.Value)
throw new NotImplementedException("Not possible with https!");
if (leftSideIsNewer.Count > 0 && l is not null && l.LeftSideIsNewer is not null && l.Raw[l.LeftSideIsNewerIndex][0] == l.LessThan && !l.LeftSideIsNewer.Value)
throw new NotImplementedException("Not possible with https!");
if (rightSideIsNewer.Count > 0 && l is not null && l.RightSideIsNewer is not null && l.Raw[l.RightSideIsNewerIndex][0] == l.LessThan && !l.RightSideIsNewer.Value)
throw new NotImplementedException("Not possible with https!");
if (rightSideOnly.Count > 0 && l is not null && l.RightSideOnly is not null && l.Raw[l.RightSideOnlyIndex][0] == l.Plus && l.RightSideOnly.Value)
throw new NotImplementedException("Not possible with https!");
if (rightSideOnly.Count > 0 && l is not null && l.RightSideOnly is not null && l.Raw[l.RightSideOnlyIndex][0] == l.Minus && !l.RightSideOnly.Value)
DoWork(logger, rightDirectory, httpClient, rightSideOnly, delete: true, download: false);
if (leftSideOnly.Count > 0 && l is not null && l.LeftSideOnly is not null && l.Raw[l.LeftSideOnlyIndex][0] == l.Plus && l.LeftSideOnly.Value)
DoWork(logger, rightDirectory, httpClient, leftSideOnly, delete: false, download: true);
if (leftSideIsNewer.Count > 0 && l is not null && l.LeftSideIsNewer is not null && l.Raw[l.LeftSideIsNewerIndex][0] == l.GreaterThan && l.LeftSideIsNewer.Value)
DoWork(logger, rightDirectory, httpClient, leftSideIsNewer, delete: true, download: true);
if (notEqualBut.Count > 0 && l is not null && l.NotEqualBut is not null && l.Raw[l.NotEqualButIndex][0] == l.Plus && l.NotEqualBut.Value)
DoWork(logger, rightDirectory, httpClient, notEqualBut, delete: true, download: true);
if (rightSideIsNewer.Count > 0 && l is not null && l.RightSideIsNewer is not null && l.Raw[l.RightSideIsNewerIndex][0] == l.GreaterThan && l.RightSideIsNewer.Value)
DoWork(logger, rightDirectory, httpClient, rightSideIsNewer, delete: true, download: true);
}
private static ReadOnlyCollection<Segment> GetAreEqual(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
double totalSeconds;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(rightRecords);
foreach (Record r in leftRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (!keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
totalSeconds = new TimeSpan(record.Ticks - r.Ticks).TotalSeconds;
if (record.Size != r.Size || totalSeconds is > 2 or < -2)
continue;
segment = new(Left: r,
LeftDirectory: checkDirectory,
Right: record,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
private static ReadOnlyDictionary<string, Record> GetKeyValuePairs(ReadOnlyCollection<Record> records) {
Dictionary<string, Record> results = [];
foreach (Record record in records)
results.Add(record.RelativePath, record);
return new(results);
}
private static ReadOnlyCollection<Segment> GetNotEqualBut(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
double totalSeconds;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(rightRecords);
foreach (Record r in leftRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (!keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
if (record.Size == r.Size)
continue;
totalSeconds = new TimeSpan(record.Ticks - r.Ticks).TotalSeconds;
if (totalSeconds is >= 2 or <= -2)
continue;
segment = new(Left: r,
LeftDirectory: checkDirectory,
Right: record,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<Segment> GetLeftSideOnly(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(rightRecords);
foreach (Record r in leftRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
segment = new(Left: r,
LeftDirectory: checkDirectory,
Right: record,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<Segment> GetRightSideOnly(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(leftRecords);
foreach (Record r in rightRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
segment = new(Left: record,
LeftDirectory: null,
Right: r,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<Segment> GetLeftSideIsNewer(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
double totalSeconds;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(rightRecords);
foreach (Record r in leftRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (!keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
totalSeconds = new TimeSpan(record.Ticks - r.Ticks).TotalSeconds;
if (totalSeconds is > -2)
continue;
segment = new(Left: r,
LeftDirectory: checkDirectory,
Right: record,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<Segment> GetRightSideIsNewer(string rightDirectory, string fileName, ReadOnlyCollection<Record> rightRecords, string rootUniformResourceLocators, ReadOnlyCollection<Record> leftRecords) {
List<Segment> results = [];
Record? record;
Segment segment;
double totalSeconds;
string? checkDirectory = null;
ReadOnlyDictionary<string, Record> keyValuePairs = GetKeyValuePairs(leftRecords);
foreach (Record r in rightRecords) {
if (checkDirectory is null && r.Size == 0 && r.Ticks == 0) {
checkDirectory = r.RelativePath;
continue;
}
if (r.RelativePath == rightDirectory || r.RelativePath == fileName)
continue;
if (!keyValuePairs.TryGetValue(r.RelativePath, out record))
continue;
totalSeconds = new TimeSpan(record.Ticks - r.Ticks).TotalSeconds;
if (totalSeconds is > -2)
continue;
segment = new(Left: record,
LeftDirectory: null,
Right: r,
RightDirectory: rightDirectory,
RootUniformResourceLocator: rootUniformResourceLocators);
results.Add(segment);
}
return results.AsReadOnly();
}
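// Sums the left-side file sizes for logging, then deletes and/or downloads the given segments on the right side according to the delete/download flags.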
private static void DoWork(ILogger<Worker> logger, string rightDirectory, HttpClient httpClient, ReadOnlyCollection<Segment> segments, bool delete, bool download) {
long sum;
Record[] records = (from l in segments where l.Left is not null select l.Left).ToArray();
try { sum = records.Sum(l => l.Size); } catch (Exception) { sum = 0; }
string size = GetSizeWithSuffix(sum);
if (delete) {
logger.LogInformation("Starting to delete {count} file(s) [{sum}]", segments.Count, size);
DoDeletes(logger, rightDirectory, segments);
logger.LogInformation("Deleted {count} file(s) [{sum}]", segments.Count, size);
}
if (download) {
logger.LogInformation("Starting to download {count} file(s) [{sum}]", segments.Count, size);
DoDownloads(logger, rightDirectory, segments, httpClient);
logger.LogInformation("Downloaded {count} file(s) [{sum}]", segments.Count, size);
}
}
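// Formats a byte count with a binary-unit suffix (bytes, KB, MB, ...) by repeatedly dividing by 1024.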
private static string GetSizeWithSuffix(long value) {
string result;
int i = 0;
string[] SizeSuffixes = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"];
if (value < 0) {
result = "-" + GetSizeWithSuffix(-value);
} else {
while (Math.Round(value / 1024f) >= 1) {
value /= 1024;
i++;
}
result = string.Format("{0:n1} {1}", value, SizeSuffixes[i]);
}
return result;
}
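// Formats the elapsed time since the given tick value as milliseconds, seconds, minutes, or hours.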
private static string GetDurationWithSuffix(long ticks) {
string result;
TimeSpan timeSpan = new(DateTime.Now.Ticks - ticks);
if (timeSpan.TotalMilliseconds < 1000)
result = $"{timeSpan.Milliseconds} ms";
else if (timeSpan.TotalMilliseconds < 60000)
result = $"{Math.Floor(timeSpan.TotalSeconds)} s";
else if (timeSpan.TotalMilliseconds < 3600000)
result = $"{Math.Floor(timeSpan.TotalMinutes)} m";
else
result = $"{Math.Floor(timeSpan.TotalHours)} h";
return result;
}
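// Deletes each segment's right-side file from the right directory, logging every success and failure.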
private static void DoDeletes(ILogger<Worker> logger, string rightDirectory, ReadOnlyCollection<Segment> segments) {
Record? record;
string size;
string count = segments.Count.ToString("000000");
#if ShellProgressBar
ProgressBar progressBar = new(segments.Count, $"Deleting: {count};", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
for (int i = 0; i < segments.Count; i++) {
#if ShellProgressBar
progressBar.Tick();
#endif
record = segments[i].Right;
if (record is null)
continue;
size = GetSizeWithSuffix(record.Size);
try {
File.Delete(Path.Combine(rightDirectory, record.RelativePath));
logger.LogInformation("{i} of {count} - Deleted: <{RelativePath}> - {size};", i.ToString("000000"), count, record.RelativePath, size);
} catch (Exception) {
logger.LogInformation("Failed to delete: <{RelativePath}> - {size};", record.RelativePath, size);
}
}
#if ShellProgressBar
progressBar.Dispose();
#endif
}
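// Downloads each target sequentially, writes the response body as text, and restores the recorded last-write time on the new file.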
private static void DoDownloads(ILogger<Worker> logger, string rightDirectory, ReadOnlyCollection<Segment> segments, HttpClient httpClient) {
int i = 0;
long ticks;
string size;
string duration;
DateTime dateTime;
Task<string> response;
string count = segments.Count.ToString("000000");
ReadOnlyCollection<Download> downloads = GetDownloads(rightDirectory, segments);
Task<HttpResponseMessage> httpResponseMessage;
#if ShellProgressBar
ProgressBar progressBar = new(downloads.Count, $"Downloading: {count};", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
foreach (Download download in downloads) {
#if ShellProgressBar
progressBar.Tick();
#endif
i += 1;
ticks = DateTime.Now.Ticks;
size = GetSizeWithSuffix(download.Size);
httpResponseMessage = httpClient.GetAsync(download.UniformResourceLocator);
httpResponseMessage.Wait(-1);
if (!httpResponseMessage.Result.IsSuccessStatusCode)
logger.LogInformation("Failed to download: <{checkURL}> - {size};", download.UniformResourceLocator, size);
else {
response = httpResponseMessage.Result.Content.ReadAsStringAsync();
response.Wait();
try {
File.WriteAllText(download.File, response.Result);
duration = GetDurationWithSuffix(ticks);
dateTime = new DateTime(download.Ticks).ToLocalTime();
File.SetLastWriteTime(download.File, dateTime);
logger.LogInformation("{i} of {count} - Downloaded: <{checkURL}> - {size} - {timeSpan};",
i.ToString("000000"),
count,
download.Display,
size,
duration);
} catch (Exception) {
logger.LogInformation("Failed to download: <{checkURL}> - {size};", download.UniformResourceLocator, size);
}
}
}
#if ShellProgressBar
progressBar.Dispose();
#endif
}
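// Builds the download list: ensures target directories exist, removes zero-length leftovers, and orders the work so the 100 smallest files go first, followed by the remaining files from largest to smallest.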
private static ReadOnlyCollection<Download> GetDownloads(string rightDirectory, ReadOnlyCollection<Segment> segments) {
List<Download> results = [];
string checkFile;
Download download;
string? checkDirectory;
List<Download> collection = [];
string? checkUniformResourceLocator;
foreach (Segment segment in segments) {
if (segment.Left is null)
continue;
checkFile = Path.Combine(rightDirectory, segment.Left.RelativePath);
checkDirectory = Path.GetDirectoryName(checkFile);
if (string.IsNullOrEmpty(checkDirectory))
continue;
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
if (File.Exists(checkFile) && new FileInfo(checkFile).Length == 0)
File.Delete(checkFile);
checkUniformResourceLocator = ConvertTo(segment.RootUniformResourceLocator, segment.Left.RelativePath);
if (string.IsNullOrEmpty(checkUniformResourceLocator))
continue;
download = new(Directory: checkDirectory,
Display: checkUniformResourceLocator[segment.RootUniformResourceLocator.Length..],
File: checkFile,
Size: segment.Left.Size,
Ticks: segment.Left.Ticks,
UniformResourceLocator: checkUniformResourceLocator);
collection.Add(download);
}
Download[] sorted = (from l in collection orderby l.Size select l).ToArray();
int stop = sorted.Length < 100 ? sorted.Length : 100;
for (int i = 0; i < stop; i++)
results.Add(sorted[i]);
for (int i = sorted.Length - 1; i > stop - 1; i--)
results.Add(sorted[i]);
if (collection.Count != results.Count)
throw new Exception();
return results.AsReadOnly();
}
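// Maps a relative Windows path onto the root URL by joining its directory names and file name with '/'; returns null when nothing beyond the original root URL was appended.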
private static string? ConvertTo(string rootURL, string relativePath) {
string? result = rootURL.EndsWith('/') ? rootURL[..^1] : rootURL;
string windowsRoot = "c:\\";
string windowsMock = $"{windowsRoot}{relativePath}";
string fileName = Path.GetFileName(windowsMock);
ReadOnlyCollection<string> directoryNames = Helpers.HelperDirectory.GetDirectoryNames(windowsMock);
foreach (string directoryName in directoryNames) {
if (directoryName == windowsRoot || directoryName == fileName)
continue;
result = $"{result}/{directoryName}";
}
result = result == rootURL ? null : $"{result}/{fileName}";
return result;
}
}

View File

@@ -0,0 +1,47 @@
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250421 {
internal static void FreeFileSyncChangeCreatedDate(ILogger<Worker> logger, List<string> args) {
string searchPattern = args[2];
string[] searchPatterns = args[3].Split('~');
string sourceDirectory = Path.GetFullPath(args[0]);
if (searchPatterns.Length != 2)
throw new NotImplementedException($"Expected 2 search patterns but {searchPatterns.Length} were passed!");
string lastSyncSearch = $"{searchPatterns[0]}=\"";
string configurationFileSearch = $"{searchPatterns[1]}=\"";
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
if (files.Length != 1)
logger.LogWarning("<{files}>(s)", files.Length);
else
ChangeCreatedDate(lastSyncSearch, configurationFileSearch, files[0]);
}
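// Scans the sync database lines for both the last-sync epoch and the configuration file path, then stamps the configuration file's creation time from that epoch.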
private static void ChangeCreatedDate(string lastSyncSearch, string configurationFileSearch, string sourceFile) {
long epoch;
string lastSync;
string[] segments;
string[] segmentsB;
DateTime creationTime;
string configurationFile;
string[] lines = File.ReadAllLines(sourceFile);
foreach (string line in lines) {
segments = line.Split(lastSyncSearch);
if (segments.Length != 2)
continue;
segmentsB = line.Split(configurationFileSearch);
if (segmentsB.Length != 2)
continue;
lastSync = segments[1].Split('"')[0];
if (!long.TryParse(lastSync, out epoch) || epoch == 0)
continue;
configurationFile = segmentsB[1].Split('"')[0];
if (!File.Exists(configurationFile))
continue;
creationTime = new(DateTimeOffset.UnixEpoch.AddSeconds(epoch).ToLocalTime().Ticks);
File.SetCreationTime(configurationFile, creationTime);
}
}
}

View File

@@ -0,0 +1,72 @@
using System.Collections.ObjectModel;
using System.Text.Json;
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250429 {
private record Record(string Directory, string File, bool FileExists);
internal static void WriteNginxFileSystem(ILogger<Worker> logger, List<string> args) {
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
ReadOnlyCollection<Record> subDirectories = GetSubDirectories(searchPattern, sourceDirectory);
if (subDirectories.Count == 0)
logger.LogWarning("<{results}>(s)", subDirectories.Count);
else
WriteNginxFileSystem(searchPattern, subDirectories);
}
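// Walks two directory levels under the source directory and records, for each second-level directory, the JSON file name derived from the search pattern and whether it already exists (directories with an existing file are ordered first).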
private static ReadOnlyCollection<Record> GetSubDirectories(string searchPattern, string sourceDirectory) {
List<Record> results = [];
bool exists;
Record record;
string checkFile;
string[] subDirectories;
string[] directories = Directory.GetDirectories(sourceDirectory, "*", SearchOption.TopDirectoryOnly);
foreach (string directory in directories) {
subDirectories = Directory.GetDirectories(directory, "*", SearchOption.TopDirectoryOnly);
foreach (string subDirectory in subDirectories) {
checkFile = Path.Combine(subDirectory, $"{searchPattern.Split('*')[^1]}.json");
exists = File.Exists(checkFile);
record = new(Directory: subDirectory, File: checkFile, FileExists: exists);
results.Add(record);
}
}
return results.OrderByDescending(l => l.FileExists).ToArray().AsReadOnly();
}
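// Serializes each directory's files as single-line NginxFileSystem JSON entries and rewrites the index file only when its content has changed.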
private static void WriteNginxFileSystem(string searchPattern, ReadOnlyCollection<Record> subDirectories) {
string lines;
string result;
string[] files;
FileInfo fileInfo;
List<string> results = [];
NginxFileSystem nginxFileSystem;
foreach (Record record in subDirectories) {
results.Clear();
files = Directory.GetFiles(record.Directory, searchPattern, SearchOption.AllDirectories);
foreach (string file in files) {
fileInfo = new(file);
nginxFileSystem = new(Name: fileInfo.FullName,
LastModified: null,
MTime: fileInfo.LastWriteTime.ToUniversalTime().ToString(),
URI: null,
Type: "file",
Length: fileInfo.Length);
results.Add(JsonSerializer.Serialize(nginxFileSystem, NginxFileSystemSingleLineSourceGenerationContext.Default.NginxFileSystem));
}
if (results.Count == 0)
continue;
result = $"[{Environment.NewLine}{string.Join($",{Environment.NewLine}", results)}{Environment.NewLine}]";
lines = !record.FileExists ? string.Empty : File.ReadAllText(record.File);
if (result == lines)
continue;
File.WriteAllText(record.File, result);
}
}
}

View File

@@ -0,0 +1,84 @@
#if Html2pdf
using iText.Html2pdf;
#endif
using Microsoft.Extensions.Logging;
#if Selenium
using OpenQA.Selenium;
using OpenQA.Selenium.Edge;
#endif
namespace File_Folder_Helper.ADO2025.PI5;
internal static partial class Helper20250505 {
// <PackageReference Include="Selenium.WebDriver" Version="4.31.0" />
// <PackageReference Include="iText" Version="9.1.0" />
// <PackageReference Include="iText.bouncy-castle-adapter" Version="9.1.0" />
// <PackageReference Include="iText.commons" Version="9.1.0" />
// <PackageReference Include="iText.hyph" Version="9.1.0" />
// <PackageReference Include="iText.pdfhtml" Version="6.1.0" />
internal static void HyperTextMarkupLanguageToPortableDocumentFormat(ILogger<Worker> logger, List<string> args) {
if (args.Count == 999)
TestA();
if (args.Count == 999)
TestB();
if (args.Count != 999)
TestC(logger);
}
private static void TestA() {
#if Html2pdf
string inputFile = Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", ".html");
string outputFile = Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", "a.pdf");
using (FileStream htmlSource = File.Open(inputFile, FileMode.Open))
using (FileStream pdfDest = File.Open(outputFile, FileMode.Create)) {
ConverterProperties converterProperties = new();
HtmlConverter.ConvertToPdf(htmlSource, pdfDest, converterProperties);
}
#endif
}
private static void TestB() {
#if Html2pdf
HttpClient httpClient = new();
Task<Stream> stream = httpClient.GetStreamAsync("https://ourrescue.org/");
stream.Wait();
string outputFile = Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", "b.pdf");
using (FileStream pdfDest = File.Open(outputFile, FileMode.Create)) {
ConverterProperties converterProperties = new();
HtmlConverter.ConvertToPdf(stream.Result, pdfDest, converterProperties);
}
#endif
}
private static void TestC(ILogger<Worker> logger) {
#if Selenium
EdgeOptions edgeOptions = new();
edgeOptions.AddArgument("--no-sandbox");
edgeOptions.AddArgument("--disable-gpu");
edgeOptions.AddArgument("--headless=new");
edgeOptions.AddArgument("--start-maximized");
edgeOptions.AddArgument("--profile-directory=Default");
edgeOptions.AddArgument("--browser-version 133.0.3065.82");
EdgeDriver edgeDriver = new(edgeOptions);
string outputFile = Path.Combine(Environment.CurrentDirectory, ".vscode", "helper", ".png");
try {
// edgeDriver.Navigate().GoToUrl("https://ourrescue.org/");
// edgeDriver.Navigate().GoToUrl("https://intranet.infineon.com/");
edgeDriver.Navigate().GoToUrl("https://messa020ec.infineon.com:50205/ProductionReport/DailyReport");
int fullWidth = int.Parse(edgeDriver.ExecuteScript("return document.body.parentNode.scrollWidth").ToString());
int fullHeight = int.Parse(edgeDriver.ExecuteScript("return document.body.parentNode.scrollHeight").ToString());
edgeDriver.Manage().Window.Size = new(fullWidth, fullHeight);
Screenshot screenshot = edgeDriver.GetScreenshot();
screenshot.SaveAsFile(outputFile);
} catch (Exception ex) {
logger.LogError(ex, ex.Message);
}
edgeDriver.Close();
#endif
}
}

381
ADO2025/PI6/.editorconfig Normal file
View File

@@ -0,0 +1,381 @@
[*.md]
end_of_line = crlf
file_header_template = unset
indent_size = 2
indent_style = space
insert_final_newline = false
root = true
tab_width = 2
[*.csproj]
end_of_line = crlf
file_header_template = unset
indent_size = 2
indent_style = space
insert_final_newline = false
root = true
tab_width = 2
[*.cs]
csharp_indent_block_contents = true
csharp_indent_braces = false
csharp_indent_case_contents = true
csharp_indent_case_contents_when_block = true
csharp_indent_labels = one_less_than_current
csharp_indent_switch_labels = true
csharp_new_line_before_catch = false
csharp_new_line_before_else = false
csharp_new_line_before_finally = false
csharp_new_line_before_members_in_anonymous_types = true
csharp_new_line_before_members_in_object_initializers = true
csharp_new_line_before_open_brace = none
csharp_new_line_between_query_expression_clauses = true
csharp_prefer_braces = false
csharp_prefer_qualified_reference = true:error
csharp_prefer_simple_default_expression = true:warning
csharp_prefer_simple_using_statement = true:warning
csharp_prefer_static_local_function = true:warning
csharp_preferred_modifier_order = public,private,protected,internal,static,extern,new,virtual,abstract,sealed,override,readonly,unsafe,volatile,async
csharp_preserve_single_line_blocks = true
csharp_preserve_single_line_statements = false
csharp_space_after_cast = false
csharp_space_after_colon_in_inheritance_clause = true
csharp_space_after_comma = true
csharp_space_after_dot = false
csharp_space_after_keywords_in_control_flow_statements = true
csharp_space_after_semicolon_in_for_statement = true
csharp_space_around_binary_operators = before_and_after
csharp_space_around_declaration_statements = false
csharp_space_before_colon_in_inheritance_clause = true
csharp_space_before_comma = false
csharp_space_before_dot = false
csharp_space_before_open_square_brackets = false
csharp_space_before_semicolon_in_for_statement = false
csharp_space_between_empty_square_brackets = false
csharp_space_between_method_call_empty_parameter_list_parentheses = false
csharp_space_between_method_call_name_and_opening_parenthesis = false
csharp_space_between_method_call_parameter_list_parentheses = false
csharp_space_between_method_declaration_empty_parameter_list_parentheses = false
csharp_space_between_method_declaration_name_and_open_parenthesis = false
csharp_space_between_method_declaration_parameter_list_parentheses = false
csharp_space_between_parentheses = false
csharp_space_between_square_brackets = false
csharp_style_allow_blank_line_after_colon_in_constructor_initializer_experimental = true
csharp_style_allow_blank_line_after_token_in_arrow_expression_clause_experimental = true
csharp_style_allow_blank_line_after_token_in_conditional_expression_experimental = true
csharp_style_allow_blank_lines_between_consecutive_braces_experimental = true
csharp_style_allow_embedded_statements_on_same_line_experimental = true
csharp_style_conditional_delegate_call = true
csharp_style_deconstructed_variable_declaration = false
csharp_style_expression_bodied_accessors = when_on_single_line:warning
csharp_style_expression_bodied_constructors = when_on_single_line:warning
csharp_style_expression_bodied_indexers = when_on_single_line:warning
csharp_style_expression_bodied_lambdas = when_on_single_line:warning
csharp_style_expression_bodied_local_functions = when_on_single_line:warning
csharp_style_expression_bodied_methods = when_on_single_line:warning
csharp_style_expression_bodied_operators = when_on_single_line:warning
csharp_style_expression_bodied_properties = when_on_single_line:warning
csharp_style_implicit_object_creation_when_type_is_apparent = true:warning
csharp_style_inlined_variable_declaration = false
csharp_style_namespace_declarations = file_scoped:warning
csharp_style_pattern_local_over_anonymous_function = true:warning
csharp_style_pattern_matching_over_as_with_null_check = true:warning
csharp_style_pattern_matching_over_is_with_cast_check = true:warning
csharp_style_prefer_index_operator = true:warning
csharp_style_prefer_not_pattern = true:warning
csharp_style_prefer_null_check_over_type_check = true
csharp_style_prefer_pattern_matching = true:warning
csharp_style_prefer_range_operator = true:warning
csharp_style_prefer_switch_expression = true:warning
csharp_style_throw_expression = true
csharp_style_unused_value_assignment_preference = discard_variable:warning
csharp_style_unused_value_expression_statement_preference = discard_variable:warning
csharp_style_var_elsewhere = false:warning
csharp_style_var_for_built_in_types = false:warning
csharp_style_var_when_type_is_apparent = false:warning
csharp_using_directive_placement = outside_namespace
dotnet_analyzer_diagnostic.category-Design.severity = error
dotnet_analyzer_diagnostic.category-Documentation.severity = error
dotnet_analyzer_diagnostic.category-Globalization.severity = none
dotnet_analyzer_diagnostic.category-Interoperability.severity = error
dotnet_analyzer_diagnostic.category-Maintainability.severity = error
dotnet_analyzer_diagnostic.category-Naming.severity = none
dotnet_analyzer_diagnostic.category-Performance.severity = none
dotnet_analyzer_diagnostic.category-Reliability.severity = error
dotnet_analyzer_diagnostic.category-Security.severity = error
dotnet_analyzer_diagnostic.category-SingleFile.severity = error
dotnet_analyzer_diagnostic.category-Style.severity = error
dotnet_analyzer_diagnostic.category-Usage.severity = error
dotnet_code_quality_unused_parameters = non_public
dotnet_code_quality.CAXXXX.api_surface = private, internal
dotnet_diagnostic.CA1001.severity = error # CA1001: Types that own disposable fields should be disposable
dotnet_diagnostic.CA1051.severity = error # CA1051: Do not declare visible instance fields
dotnet_diagnostic.CA1511.severity = warning # CA1511: Use 'ArgumentException.ThrowIfNullOrEmpty' instead of explicitly throwing a new exception instance
dotnet_diagnostic.CA1513.severity = warning # Use 'ObjectDisposedException.ThrowIf' instead of explicitly throwing a new exception instance
dotnet_diagnostic.CA1825.severity = warning # CA1825: Avoid zero-length array allocations
dotnet_diagnostic.CA1829.severity = error # CA1829: Use Length/Count property instead of Count() when available
dotnet_diagnostic.CA1834.severity = warning # CA1834: Consider using 'StringBuilder.Append(char)' when applicable
dotnet_diagnostic.CA1860.severity = error # CA1860: Prefer comparing 'Count' to 0 rather than using 'Any()', both for clarity and for performance
dotnet_diagnostic.CA1862.severity = warning # CA1862: Prefer using 'string.Equals(string, StringComparison)' to perform a case-insensitive comparison, but keep in mind that this might cause subtle changes in behavior, so make sure to conduct thorough testing after applying the suggestion, or if culturally sensitive comparison is not required, consider using 'StringComparison.OrdinalIgnoreCase'
dotnet_diagnostic.CA1869.severity = none # CA1869: Avoid creating a new 'JsonSerializerOptions' instance for every serialization operation. Cache and reuse instances instead.
dotnet_diagnostic.CA2201.severity = none # CA2201: Exception type System.NullReferenceException is reserved by the runtime
dotnet_diagnostic.CA2254.severity = none # CA2254: The logging message template should not vary between calls to 'LoggerExtensions.LogInformation(ILogger, string?, params object?[])'
dotnet_diagnostic.IDE0001.severity = warning # IDE0001: Simplify name
dotnet_diagnostic.IDE0002.severity = warning # Simplify (member access) - System.Version.Equals("1", "2"); Version.Equals("1", "2");
dotnet_diagnostic.IDE0004.severity = warning # IDE0004: Cast is redundant.
dotnet_diagnostic.IDE0005.severity = error # Using directive is unnecessary
dotnet_diagnostic.IDE0010.severity = none # Add missing cases to switch statement (IDE0010)
dotnet_diagnostic.IDE0028.severity = error # IDE0028: Collection initialization can be simplified
dotnet_diagnostic.IDE0031.severity = warning # Use null propagation (IDE0031)
dotnet_diagnostic.IDE0047.severity = warning # IDE0047: Parentheses can be removed
dotnet_diagnostic.IDE0048.severity = none # Parentheses preferences (IDE0047 and IDE0048)
dotnet_diagnostic.IDE0049.severity = warning # Use language keywords instead of framework type names for type references (IDE0049)
dotnet_diagnostic.IDE0051.severity = error # IDE0051: Remove unused private members
dotnet_diagnostic.IDE0058.severity = error # IDE0058: Expression value is never used
dotnet_diagnostic.IDE0060.severity = error # IDE0060: Remove unused parameter
dotnet_diagnostic.IDE0074.severity = warning # IDE0074: Use compound assignment
dotnet_diagnostic.IDE0130.severity = none # Namespace does not match folder structure (IDE0130)
dotnet_diagnostic.IDE0270.severity = warning # IDE0270: Null check can be simplified
dotnet_diagnostic.IDE0290.severity = none # Use primary constructor [Distance]csharp(IDE0290)
dotnet_diagnostic.IDE0300.severity = error # IDE0300: Collection initialization can be simplified
dotnet_diagnostic.IDE0301.severity = error # IDE0301: Collection initialization can be simplified
dotnet_diagnostic.IDE0305.severity = none # IDE0305: Collection initialization can be simplified
dotnet_diagnostic.IDE2000.severity = error # IDE2000: Allow multiple blank lines
dotnet_naming_rule.abstract_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.abstract_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.abstract_method_should_be_pascal_case.symbols = abstract_method
dotnet_naming_rule.class_should_be_pascal_case.severity = warning
dotnet_naming_rule.class_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.class_should_be_pascal_case.symbols = class
dotnet_naming_rule.delegate_should_be_pascal_case.severity = warning
dotnet_naming_rule.delegate_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.delegate_should_be_pascal_case.symbols = delegate
dotnet_naming_rule.enum_should_be_pascal_case.severity = warning
dotnet_naming_rule.enum_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.enum_should_be_pascal_case.symbols = enum
dotnet_naming_rule.event_should_be_pascal_case.severity = warning
dotnet_naming_rule.event_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.event_should_be_pascal_case.symbols = event
dotnet_naming_rule.interface_should_be_begins_with_i.severity = warning
dotnet_naming_rule.interface_should_be_begins_with_i.style = begins_with_i
dotnet_naming_rule.interface_should_be_begins_with_i.symbols = interface
dotnet_naming_rule.method_should_be_pascal_case.severity = warning
dotnet_naming_rule.method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.method_should_be_pascal_case.symbols = method
dotnet_naming_rule.non_field_members_should_be_pascal_case.severity = warning
dotnet_naming_rule.non_field_members_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.non_field_members_should_be_pascal_case.symbols = non_field_members
dotnet_naming_rule.private_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.private_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.private_method_should_be_pascal_case.symbols = private_method
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.private_or_internal_field_should_be_private_of_internal_field.symbols = private_or_internal_field
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.private_or_internal_static_field_should_be_private_of_internal_field.symbols = private_or_internal_static_field
dotnet_naming_rule.property_should_be_pascal_case.severity = warning
dotnet_naming_rule.property_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.property_should_be_pascal_case.symbols = property
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.severity = warning
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.style = private_of_internal_field
dotnet_naming_rule.public_or_protected_field_should_be_private_of_internal_field.symbols = public_or_protected_field
dotnet_naming_rule.static_field_should_be_pascal_case.severity = warning
dotnet_naming_rule.static_field_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.static_field_should_be_pascal_case.symbols = static_field
dotnet_naming_rule.static_method_should_be_pascal_case.severity = warning
dotnet_naming_rule.static_method_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.static_method_should_be_pascal_case.symbols = static_method
dotnet_naming_rule.struct_should_be_pascal_case.severity = warning
dotnet_naming_rule.struct_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.struct_should_be_pascal_case.symbols = struct
dotnet_naming_rule.types_should_be_pascal_case.severity = warning
dotnet_naming_rule.types_should_be_pascal_case.style = pascal_case
dotnet_naming_rule.types_should_be_pascal_case.symbols = types
dotnet_naming_style.begins_with_i.capitalization = pascal_case
dotnet_naming_style.begins_with_i.required_prefix = I
dotnet_naming_style.begins_with_i.required_suffix =
dotnet_naming_style.begins_with_i.word_separator =
dotnet_naming_style.pascal_case.capitalization = pascal_case
dotnet_naming_style.pascal_case.required_prefix =
dotnet_naming_style.pascal_case.required_suffix =
dotnet_naming_style.pascal_case.word_separator =
dotnet_naming_style.private_of_internal_field.capitalization = pascal_case
dotnet_naming_style.private_of_internal_field.required_prefix = _
dotnet_naming_style.private_of_internal_field.required_suffix =
dotnet_naming_style.private_of_internal_field.word_separator =
dotnet_naming_symbols.abstract_method.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.abstract_method.applicable_kinds = method
dotnet_naming_symbols.abstract_method.required_modifiers = abstract
dotnet_naming_symbols.class.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.class.applicable_kinds = class
dotnet_naming_symbols.class.required_modifiers =
dotnet_naming_symbols.delegate.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.delegate.applicable_kinds = delegate
dotnet_naming_symbols.delegate.required_modifiers =
dotnet_naming_symbols.enum.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.enum.applicable_kinds = enum
dotnet_naming_symbols.enum.required_modifiers =
dotnet_naming_symbols.event.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.event.applicable_kinds = event
dotnet_naming_symbols.event.required_modifiers =
dotnet_naming_symbols.interface.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.interface.applicable_kinds = interface
dotnet_naming_symbols.interface.required_modifiers =
dotnet_naming_symbols.method.applicable_accessibilities = public
dotnet_naming_symbols.method.applicable_kinds = method
dotnet_naming_symbols.method.required_modifiers =
dotnet_naming_symbols.non_field_members.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.non_field_members.applicable_kinds = property, event, method
dotnet_naming_symbols.non_field_members.required_modifiers =
dotnet_naming_symbols.private_method.applicable_accessibilities = private
dotnet_naming_symbols.private_method.applicable_kinds = method
dotnet_naming_symbols.private_method.required_modifiers =
dotnet_naming_symbols.private_or_internal_field.applicable_accessibilities = internal, private, private_protected
dotnet_naming_symbols.private_or_internal_field.applicable_kinds = field
dotnet_naming_symbols.private_or_internal_field.required_modifiers =
dotnet_naming_symbols.private_or_internal_static_field.applicable_accessibilities = internal, private, private_protected
dotnet_naming_symbols.private_or_internal_static_field.applicable_kinds = field
dotnet_naming_symbols.private_or_internal_static_field.required_modifiers = static
dotnet_naming_symbols.property.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.property.applicable_kinds = property
dotnet_naming_symbols.property.required_modifiers =
dotnet_naming_symbols.public_or_protected_field.applicable_accessibilities = public, protected
dotnet_naming_symbols.public_or_protected_field.applicable_kinds = field
dotnet_naming_symbols.public_or_protected_field.required_modifiers =
dotnet_naming_symbols.static_field.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.static_field.applicable_kinds = field
dotnet_naming_symbols.static_field.required_modifiers = static
dotnet_naming_symbols.static_method.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.static_method.applicable_kinds = method
dotnet_naming_symbols.static_method.required_modifiers = static
dotnet_naming_symbols.struct.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.struct.applicable_kinds = struct
dotnet_naming_symbols.struct.required_modifiers =
dotnet_naming_symbols.types.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.types.applicable_kinds = class, struct, interface, enum
dotnet_naming_symbols.types.required_modifiers =
dotnet_remove_unnecessary_suppression_exclusions = 0
dotnet_separate_import_directive_groups = true
dotnet_sort_system_directives_first = true
dotnet_style_allow_multiple_blank_lines_experimental = false:warning
dotnet_style_allow_statement_immediately_after_block_experimental = true
dotnet_style_coalesce_expression = true
dotnet_style_collection_initializer = true:warning
dotnet_style_explicit_tuple_names = true:warning
dotnet_style_namespace_match_folder = true
dotnet_style_null_propagation = true:warning
dotnet_style_object_initializer = true:warning
dotnet_style_operator_placement_when_wrapping = beginning_of_line
dotnet_style_parentheses_in_arithmetic_binary_operators = always_for_clarity
dotnet_style_parentheses_in_other_binary_operators = always_for_clarity
dotnet_style_parentheses_in_other_operators = never_if_unnecessary
dotnet_style_parentheses_in_relational_binary_operators = always_for_clarity
dotnet_style_predefined_type_for_locals_parameters_members = true
dotnet_style_predefined_type_for_member_access = true:warning
dotnet_style_prefer_auto_properties = true:warning
dotnet_style_prefer_compound_assignment = true:warning
dotnet_style_prefer_conditional_expression_over_assignment = false
dotnet_style_prefer_conditional_expression_over_return = false
dotnet_style_prefer_inferred_anonymous_type_member_names = true:warning
dotnet_style_prefer_inferred_tuple_names = true:warning
dotnet_style_prefer_is_null_check_over_reference_equality_method = true:warning
dotnet_style_prefer_simplified_boolean_expressions = true:warning
dotnet_style_prefer_simplified_interpolation = true
dotnet_style_qualification_for_event = false:error
dotnet_style_qualification_for_field = false
dotnet_style_qualification_for_method = false:error
dotnet_style_qualification_for_property = false:error
dotnet_style_readonly_field = true:warning
dotnet_style_require_accessibility_modifiers = for_non_interface_members
end_of_line = crlf
file_header_template = unset
indent_size = 4
indent_style = space
insert_final_newline = false
root = true
tab_width = 4
# https://docs.microsoft.com/en-us/dotnet/fundamentals/code-analysis/quality-rules/ca1822
# https://github.com/dotnet/aspnetcore/blob/main/.editorconfig
# https://github.com/dotnet/project-system/blob/main/.editorconfig
# Question
csharp_prefer_simple_using_statement = false # Question
csharp_style_expression_bodied_constructors = when_on_single_line:none # Question
csharp_style_expression_bodied_properties = true # Question
csharp_style_implicit_object_creation_when_type_is_apparent = true:warning # Question
csharp_style_pattern_matching_over_as_with_null_check = false # Question
csharp_style_prefer_pattern_matching = false # Question
csharp_style_prefer_range_operator = false # Question
csharp_style_prefer_switch_expression = false # Question
csharp_style_unused_value_assignment_preference = unused_local_variable # Question
csharp_style_unused_value_expression_statement_preference = false # Question
csharp_style_var_elsewhere = false:none # Question
csharp_style_var_for_built_in_types = false:none # Question
csharp_style_var_when_type_is_apparent = false:warning # Question
dotnet_diagnostic.CA1001.severity = none # Question - Types that own disposable fields should be disposable
dotnet_diagnostic.CA1051.severity = none # Question - Do not declare visible instance fields
dotnet_diagnostic.CA1416.severity = none # Question - This call site is reachable on all platforms.
dotnet_diagnostic.CA1510.severity = none # Question - Use 'ArgumentNullException.ThrowIfNull' instead of explicitly throwing a new exception instance
dotnet_diagnostic.CA1834.severity = none # CA1834: Consider using 'StringBuilder.Append(char)' when applicable
dotnet_diagnostic.CA1860.severity = none # Question - Avoid using 'Enumerable.Any()' extension method; prefer comparing 'Count' to 0
dotnet_diagnostic.CA1862.severity = none # Question - Prefer using 'string.Equals(string, StringComparison)' for case-insensitive comparisons
dotnet_diagnostic.CA2208.severity = none # Question - Instantiate argument exceptions correctly
dotnet_diagnostic.CA2211.severity = none # Question - Non-constant fields should not be visible
dotnet_diagnostic.CA2249.severity = none # Question - Use 'string.Contains' instead of 'string.IndexOf'
dotnet_diagnostic.CA2253.severity = none # Question - Named placeholders should not be numeric values
dotnet_diagnostic.CS0103.severity = none # Question - The name does not exist in the current context
dotnet_diagnostic.CS0168.severity = none # Question - The variable is declared but never used
dotnet_diagnostic.CS0219.severity = none # Question - The variable is assigned but its value is never used
dotnet_diagnostic.CS0612.severity = none # Question - Member is obsolete
dotnet_diagnostic.CS0618.severity = none # Question - Member is obsolete (Compiler Warning, level 2)
dotnet_diagnostic.CS0659.severity = none # Question - Type overrides Object.Equals but not Object.GetHashCode (Compiler Warning, level 3)
dotnet_diagnostic.CS8019.severity = warning # Question - Unnecessary using directive.
dotnet_diagnostic.CS8600.severity = none # Question - Converting null literal or possible null value to non-nullable type
dotnet_diagnostic.CS8602.severity = none # Question - Dereference of a possibly null reference.
dotnet_diagnostic.CS8603.severity = none # Question - Possible null reference return
dotnet_diagnostic.CS8604.severity = none # Question - Possible null reference argument for parameter.
dotnet_diagnostic.CS8618.severity = none # Question - Non-nullable variable must contain a non-null value when exiting constructor
dotnet_diagnostic.CS8625.severity = none # Question - Cannot convert null literal to non-nullable reference type.
dotnet_diagnostic.CS8629.severity = none # Question - Nullable value type may be null
dotnet_diagnostic.CS8765.severity = none # Question - Nullability of type of parameter
dotnet_diagnostic.IDE0005.severity = none # Question - Remove unnecessary using directives
dotnet_diagnostic.IDE0008.severity = warning # Question - Use explicit type instead of 'var'
dotnet_diagnostic.IDE0017.severity = none # Question - Object initialization can be simplified
dotnet_diagnostic.IDE0019.severity = none # Question - Use pattern matching
dotnet_diagnostic.IDE0021.severity = none # Question - Use expression body for constructor
dotnet_diagnostic.IDE0022.severity = none # Question - Use expression body for method
dotnet_diagnostic.IDE0025.severity = none # Question - Use expression body for property
dotnet_diagnostic.IDE0027.severity = none # Question - Use expression body for accessor
dotnet_diagnostic.IDE0028.severity = none # Question - Use collection initializers or expressions
dotnet_diagnostic.IDE0031.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0032.severity = none # Question - Use auto property
dotnet_diagnostic.IDE0037.severity = none # Question - Member name can be simplified
dotnet_diagnostic.IDE0041.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0047.severity = none # Question - Parentheses preferences
dotnet_diagnostic.IDE0049.severity = warning # Question - Name can be simplified
dotnet_diagnostic.IDE0051.severity = none # Question - Remove unused private member
dotnet_diagnostic.IDE0053.severity = none # Question - Use expression body for lambdas
dotnet_diagnostic.IDE0054.severity = none # Question - Use compound assignment
dotnet_diagnostic.IDE0055.severity = none # Question - Formatting rule
dotnet_diagnostic.IDE0057.severity = none # Question - Substring can be simplified
dotnet_diagnostic.IDE0058.severity = none # Question - Remove unnecessary expression value
dotnet_diagnostic.IDE0059.severity = none # Question - Unnecessary assignment of a value to
dotnet_diagnostic.IDE0060.severity = none # Question - Remove unused parameter
dotnet_diagnostic.IDE0063.severity = none # Question - Use simple 'using' statement
dotnet_diagnostic.IDE0065.severity = none # Question - Misplaced using directive ('using' directive placement)
dotnet_diagnostic.IDE0066.severity = none # Question - Use 'switch' expression
dotnet_diagnostic.IDE0078.severity = none # Question - Use pattern matching (may change code meaning)
dotnet_diagnostic.IDE0090.severity = warning # Question - Simplify new expression
dotnet_diagnostic.IDE0100.severity = error # Question - Remove redundant equality
dotnet_diagnostic.IDE0160.severity = warning # Question - Use block-scoped namespace
dotnet_diagnostic.IDE0161.severity = warning # Question - Namespace declaration preferences
dotnet_diagnostic.IDE0270.severity = none # Question - Null check can be simplified
dotnet_diagnostic.IDE0300.severity = none # Question - Collection initialization can be simplified
dotnet_diagnostic.IDE1006.severity = none # Question - Naming rule violation (naming styles)
dotnet_style_null_propagation = false # Question
dotnet_style_object_initializer = false # Question
dotnet_style_prefer_auto_properties = false # Question
dotnet_style_allow_statement_immediately_after_block_experimental = true # Question
dotnet_style_prefer_inferred_anonymous_type_member_names = false:warning # Question
dotnet_style_prefer_is_null_check_over_reference_equality_method = false # Question

277
ADO2025/PI6/Envelope.cs Normal file
View File

@@ -0,0 +1,277 @@
#nullable disable
#pragma warning disable CS8603
#pragma warning disable CS8632
#pragma warning disable IDE1006
namespace IFX.Shared.PasteSpecialXml.EAF.XML.API.Envelope;
// NOTE: Generated code may require at least .NET Framework 4.5 or .NET Core/Standard 2.0.
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://www.w3.org/2003/05/soap-envelope")]
[System.Xml.Serialization.XmlRoot(Namespace = "http://www.w3.org/2003/05/soap-envelope", IsNullable = false)]
public partial class Envelope
{
private EnvelopeHeader? headerField;
private EnvelopeBody? bodyField;
/// <remarks/>
public EnvelopeHeader Header
{
get => this.headerField;
set => this.headerField = value;
}
/// <remarks/>
public EnvelopeBody Body
{
get => this.bodyField;
set => this.bodyField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://www.w3.org/2003/05/soap-envelope")]
public partial class EnvelopeHeader
{
private Sequence? sequenceField;
private SequenceAcknowledgement? sequenceAcknowledgementField;
private Action? actionField;
private string? relatesToField;
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public Sequence Sequence
{
get => this.sequenceField;
set => this.sequenceField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public SequenceAcknowledgement SequenceAcknowledgement
{
get => this.sequenceAcknowledgementField;
set => this.sequenceAcknowledgementField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://www.w3.org/2005/08/addressing")]
public Action Action
{
get => this.actionField;
set => this.actionField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://www.w3.org/2005/08/addressing")]
public string RelatesTo
{
get => this.relatesToField;
set => this.relatesToField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
[System.Xml.Serialization.XmlRoot(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm", IsNullable = false)]
public partial class Sequence
{
private string? identifierField;
private byte messageNumberField;
private object? lastMessageField;
private byte mustUnderstandField;
/// <remarks/>
public string Identifier
{
get => this.identifierField;
set => this.identifierField = value;
}
/// <remarks/>
public byte MessageNumber
{
get => this.messageNumberField;
set => this.messageNumberField = value;
}
/// <remarks/>
public object LastMessage
{
get => this.lastMessageField;
set => this.lastMessageField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlAttribute(Form = System.Xml.Schema.XmlSchemaForm.Qualified, Namespace = "http://www.w3.org/2003/05/soap-envelope")]
public byte mustUnderstand
{
get => this.mustUnderstandField;
set => this.mustUnderstandField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
[System.Xml.Serialization.XmlRoot(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm", IsNullable = false)]
public partial class SequenceAcknowledgement
{
private string? identifierField;
private SequenceAcknowledgementAcknowledgementRange? acknowledgementRangeField;
private byte bufferRemainingField;
/// <remarks/>
public string Identifier
{
get => this.identifierField;
set => this.identifierField = value;
}
/// <remarks/>
public SequenceAcknowledgementAcknowledgementRange AcknowledgementRange
{
get => this.acknowledgementRangeField;
set => this.acknowledgementRangeField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://schemas.microsoft.com/ws/2006/05/rm")]
public byte BufferRemaining
{
get => this.bufferRemainingField;
set => this.bufferRemainingField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public partial class SequenceAcknowledgementAcknowledgementRange
{
private byte lowerField;
private byte upperField;
/// <remarks/>
[System.Xml.Serialization.XmlAttribute()]
public byte Lower
{
get => this.lowerField;
set => this.lowerField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlAttribute()]
public byte Upper
{
get => this.upperField;
set => this.upperField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://www.w3.org/2005/08/addressing")]
[System.Xml.Serialization.XmlRoot(Namespace = "http://www.w3.org/2005/08/addressing", IsNullable = false)]
public partial class Action
{
private byte mustUnderstandField;
private string? valueField;
/// <remarks/>
[System.Xml.Serialization.XmlAttribute(Form = System.Xml.Schema.XmlSchemaForm.Qualified, Namespace = "http://www.w3.org/2003/05/soap-envelope")]
public byte mustUnderstand
{
get => this.mustUnderstandField;
set => this.mustUnderstandField = value;
}
/// <remarks/>
[System.Xml.Serialization.XmlText()]
public string Value
{
get => this.valueField;
set => this.valueField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://www.w3.org/2003/05/soap-envelope")]
public partial class EnvelopeBody
{
private CreateSequenceResponse? createSequenceResponseField;
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public CreateSequenceResponse CreateSequenceResponse
{
get => this.createSequenceResponseField;
set => this.createSequenceResponseField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
[System.Xml.Serialization.XmlRoot(Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm", IsNullable = false)]
public partial class CreateSequenceResponse
{
private string? identifierField;
private CreateSequenceResponseAccept? acceptField;
/// <remarks/>
public string Identifier
{
get => this.identifierField;
set => this.identifierField = value;
}
/// <remarks/>
public CreateSequenceResponseAccept Accept
{
get => this.acceptField;
set => this.acceptField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public partial class CreateSequenceResponseAccept
{
private CreateSequenceResponseAcceptAcksTo? acksToField;
/// <remarks/>
public CreateSequenceResponseAcceptAcksTo AcksTo
{
get => this.acksToField;
set => this.acksToField = value;
}
}
/// <remarks/>
[Serializable()]
[System.ComponentModel.DesignerCategory("code")]
[System.Xml.Serialization.XmlType(AnonymousType = true, Namespace = "http://schemas.xmlsoap.org/ws/2005/02/rm")]
public partial class CreateSequenceResponseAcceptAcksTo
{
private string? addressField;
/// <remarks/>
[System.Xml.Serialization.XmlElement(Namespace = "http://www.w3.org/2005/08/addressing")]
public string Address
{
get => this.addressField;
set => this.addressField = value;
}
}

View File

@@ -0,0 +1,531 @@
using System.Collections.ObjectModel;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Web;
using Microsoft.Extensions.FileSystemGlobbing;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI6;
internal static partial class Helper20250519 {
private record RelativePath(string LeftDirectory,
string? RightDirectory,
Record[] Records) {
public override string ToString() {
string result = JsonSerializer.Serialize(this, Helper20250519RelativePath.Default.RelativePath);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(RelativePath))]
private partial class Helper20250519RelativePath : JsonSerializerContext {
}
private record Review(Segment[]? AreEqual,
Segment[]? LeftSideIsNewer,
Segment[]? LeftSideOnly,
Segment[]? NotEqualBut,
Record[]? Records,
Segment[]? RightSideIsNewer,
Segment[]? RightSideOnly) {
public override string ToString() {
string result = JsonSerializer.Serialize(this, Helper20250519Review.Default.Review);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Review))]
private partial class Helper20250519Review : JsonSerializerContext {
}
private record Record(string RelativePath,
long Size,
long Ticks);
private record Segment(Record? Left,
Record? Right);
private record Verb(string Directory,
string Display,
string File,
string Multipart,
string RelativePath,
long Size,
long Ticks,
string UrlEncodedFile);
private record Input(string RightDirectory,
string LeftDirectory,
string IncludePatternsFile,
string ExcludePatternsFile,
string[] BaseAddresses,
string Page,
string[] Segments) {
private static string GetDirectory(List<string> args) =>
Path.GetFullPath(args[0].Split('~')[0]);
internal static Input Get(List<string> args) =>
new(RightDirectory: GetDirectory(args),
LeftDirectory: Path.GetFullPath(args[2].Split('~')[0]),
IncludePatternsFile: Path.Combine(GetDirectory(args), ".vscode", args[3]),
ExcludePatternsFile: Path.Combine(GetDirectory(args), ".vscode", args[4]),
BaseAddresses: args.Count < 6 ? [] : args[5].Split('~'),
Page: args[6],
Segments: args[9].Split('~'));
public override string ToString() {
string result = JsonSerializer.Serialize(this, Helper20250519Input.Default.Input);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Input))]
private partial class Helper20250519Input : JsonSerializerContext {
}
private record Logic(string Comment,
char GreaterThan,
bool? LeftSideIsNewer,
int LeftSideIsNewerIndex,
bool? LeftSideOnly,
int LeftSideOnlyIndex,
char LessThan,
char Minus,
bool? NotEqualBut,
int NotEqualButIndex,
char Plus,
string[] Raw,
bool? RightSideIsNewer,
int RightSideIsNewerIndex,
bool? RightSideOnly,
int RightSideOnlyIndex) {
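// Parses the six control segments (left-only, left-newer, not-equal-but, right-newer, right-only, comment): '+'/'-' set the "only" flags, 'G'/'L' set the "newer"/"not-equal" flags, and any other non-empty value invalidates the whole set (returns null).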
internal static Logic? Get(string[] segments) {
Logic? result;
bool check = true;
bool? notEqualBut;
bool? leftSideOnly;
bool? rightSideOnly;
bool? leftSideIsNewer;
const char plus = '+';
bool? rightSideIsNewer;
const char minus = '-';
const char lessThan = 'L';
const int commentIndex = 5;
const char greaterThan = 'G';
const int notEqualButIndex = 2;
const int leftSideOnlyIndex = 0;
const int rightSideOnlyIndex = 4;
const int leftSideIsNewerIndex = 1;
const int rightSideIsNewerIndex = 3;
string comment = segments[commentIndex];
if (string.IsNullOrEmpty(segments[leftSideOnlyIndex])) {
leftSideOnly = null;
} else if (segments[leftSideOnlyIndex][0] == plus) {
leftSideOnly = true;
} else if (segments[leftSideOnlyIndex][0] == minus) {
leftSideOnly = false;
} else {
check = false;
leftSideOnly = null;
}
if (string.IsNullOrEmpty(segments[leftSideIsNewerIndex])) {
leftSideIsNewer = null;
} else if (segments[leftSideIsNewerIndex][0] == greaterThan) {
leftSideIsNewer = true;
} else if (segments[leftSideIsNewerIndex][0] == lessThan) {
leftSideIsNewer = false;
} else {
check = false;
leftSideIsNewer = null;
}
if (string.IsNullOrEmpty(segments[notEqualButIndex])) {
notEqualBut = null;
} else if (segments[notEqualButIndex][0] == greaterThan) {
notEqualBut = true;
} else if (segments[notEqualButIndex][0] == lessThan) {
notEqualBut = false;
} else {
check = false;
notEqualBut = null;
}
if (string.IsNullOrEmpty(segments[rightSideIsNewerIndex])) {
rightSideIsNewer = null;
} else if (segments[rightSideIsNewerIndex][0] == greaterThan) {
rightSideIsNewer = true;
} else if (segments[rightSideIsNewerIndex][0] == lessThan) {
rightSideIsNewer = false;
} else {
check = false;
rightSideIsNewer = null;
}
if (string.IsNullOrEmpty(segments[rightSideOnlyIndex])) {
rightSideOnly = null;
} else if (segments[rightSideOnlyIndex][0] == plus) {
rightSideOnly = true;
} else if (segments[rightSideOnlyIndex][0] == minus) {
rightSideOnly = false;
} else {
check = false;
rightSideOnly = null;
}
result = !check ? null : new(Comment: comment,
GreaterThan: greaterThan,
LeftSideIsNewerIndex: leftSideIsNewerIndex,
LeftSideIsNewer: leftSideIsNewer,
LeftSideOnly: leftSideOnly,
LeftSideOnlyIndex: leftSideOnlyIndex,
LessThan: lessThan,
Minus: minus,
NotEqualBut: notEqualBut,
NotEqualButIndex: notEqualButIndex,
Plus: plus,
RightSideIsNewer: rightSideIsNewer,
RightSideIsNewerIndex: rightSideIsNewerIndex,
RightSideOnly: rightSideOnly,
Raw: segments,
RightSideOnlyIndex: rightSideOnlyIndex);
return result;
}
public override string ToString() {
string result = JsonSerializer.Serialize(this, Helper20250519Logic.Default.Logic);
return result;
}
}
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Logic))]
private partial class Helper20250519Logic : JsonSerializerContext {
}
internal static void LiveSync(ILogger<Worker> logger, List<string> args) {
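// args[2] may point to an input.json file; when that file exists and is non-empty its contents are
// deserialized and used instead of the positional arguments, otherwise Input.Get builds the configuration from args.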
logger.LogInformation(args[0]);
logger.LogInformation(args[1]);
logger.LogInformation(args[2]);
if (args[2].EndsWith("input.json") && !File.Exists(args[2])) {
File.WriteAllText(args[2], "{}");
}
string? json = !args[2].EndsWith("input.json") ? null : File.ReadAllText(args[2]);
Input input = string.IsNullOrEmpty(json)
? Input.Get(args)
: JsonSerializer.Deserialize(json, Helper20250519Input.Default.Input)
?? throw new Exception();
Logic? logic = input.Segments.Length != 6 ? null : Logic.Get(input.Segments);
if (logic is null || input.BaseAddresses.Length == 0) {
logger.LogInformation($"Invalid input!{Environment.NewLine}{input}");
} else {
Matcher matcher = GetMatcher(input.ExcludePatternsFile, input.IncludePatternsFile);
ReadOnlyCollection<Record> records = GetRecords(input.RightDirectory, matcher);
if (records.Count == 0) {
logger.LogInformation("No source records");
} else {
RelativePath relativePath = new(LeftDirectory: input.LeftDirectory, RightDirectory: input.RightDirectory, Records: records.ToArray());
json = JsonSerializer.Serialize(relativePath, Helper20250519RelativePath.Default.RelativePath);
if (string.IsNullOrEmpty(json)) {
LiveSync180(logger, logic, input.BaseAddresses, input.Page, relativePath);
} else {
File.WriteAllText(Path.Combine(input.RightDirectory, ".vscode", $"{nameof(RelativePath)}.json"), json);
}
}
}
}
private static Matcher GetMatcher(string excludePatternsFile, string includePatternsFile) {
Matcher result = new();
result.AddIncludePatterns(!File.Exists(includePatternsFile) ? ["*"] : File.ReadAllLines(includePatternsFile));
result.AddExcludePatterns(!File.Exists(excludePatternsFile) ? ["System Volume Information"] : File.ReadAllLines(excludePatternsFile));
return result;
}
private static ReadOnlyCollection<Record> GetRecords(string directory, Matcher matcher) {
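// The first record is the root directory itself (Size and Ticks stay 0); after it, only non-empty
// files that pass the include/exclude matcher are added.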
List<Record> results = [
new(RelativePath: directory,
Size: 0,
Ticks: 0)];
Record record;
FileInfo fileInfo;
string relativePath;
ReadOnlyCollection<ReadOnlyCollection<string>> collection = Helpers.HelperDirectory.GetFilesCollection(directory, "*", "*");
foreach (ReadOnlyCollection<string> c in collection) {
foreach (string f in c) {
if (!matcher.Match(directory, f).HasMatches) {
continue;
}
fileInfo = new(f);
if (fileInfo.Length == 0) {
continue;
}
relativePath = Path.GetRelativePath(directory, fileInfo.FullName);
record = new(RelativePath: relativePath,
Size: fileInfo.Length,
Ticks: fileInfo.LastWriteTime.ToUniversalTime().Ticks);
results.Add(record);
}
}
return results.AsReadOnly();
}
private static void LiveSync180(ILogger<Worker> logger, Logic logic, string[] baseAddresses, string page, RelativePath relativePath) {
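// POST the local file inventory (RelativePath) to each base address and parse the returned Review of
// differences; each successful response is handed to the per-category dispatcher below.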
Review? review;
Task<string> response;
Task<HttpResponseMessage> httpResponseMessage;
string json = JsonSerializer.Serialize(relativePath, Helper20250519RelativePath.Default.RelativePath);
foreach (string baseAddress in baseAddresses) {
if (!baseAddress.StartsWith("http:")) {
logger.LogInformation("Not supported URL <{url}>", baseAddress);
} else {
HttpClient httpClient = new();
httpClient.BaseAddress = new(baseAddress);
StringContent stringContent = new(json, Encoding.UTF8, "application/json");
httpResponseMessage = httpClient.PostAsync(page, stringContent);
httpResponseMessage.Wait();
if (!httpResponseMessage.Result.IsSuccessStatusCode) {
logger.LogInformation("Failed to download: <{uniformResourceLocator}>;", httpClient.BaseAddress);
} else {
response = httpResponseMessage.Result.Content.ReadAsStringAsync();
response.Wait();
review = JsonSerializer.Deserialize(response.Result, Helper20250519Review.Default.Review);
if (review is null) {
logger.LogInformation("Failed to download: <{uniformResourceLocator}>;", httpClient.BaseAddress);
continue;
}
LiveSync(logger, logic, page, relativePath, httpClient, review);
}
}
}
}
private static void LiveSync(ILogger<Worker> logger, Logic l, string page, RelativePath relativePath, HttpClient httpClient, Review review) {
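// Each Review category is acted on only when the matching logic segment enables it: '+'/'-' drive the
// side-only categories and 'G'/'L' the is-newer/not-equal-but categories, selecting an HTTP verb (GET, DELETE),
// a local delete, or no action at all.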
if (review.NotEqualBut?.Length > 0 && l is not null && l.NotEqualBut is not null && l.Raw[l.NotEqualButIndex][0] == l.LessThan && !l.NotEqualBut.Value) {
logger.LogDebug("Doing nothing with {name}", nameof(Logic.NotEqualBut));
}
if (review.LeftSideOnly?.Length > 0 && l is not null && l.LeftSideOnly is not null && l.Raw[l.LeftSideOnlyIndex][0] == l.Minus && !l.LeftSideOnly.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.LeftDirectory, (from x in review.LeftSideOnly select x.Left).ToArray().AsReadOnly(), HttpMethod.Delete, delete: false);
}
if (review.LeftSideIsNewer?.Length > 0 && l is not null && l.LeftSideIsNewer is not null && l.Raw[l.LeftSideIsNewerIndex][0] == l.LessThan && !l.LeftSideIsNewer.Value) {
throw new Exception(); // LiveSync(logger, page, relativePath, httpClient, relativePath.LeftDirectory, (from x in review.LeftSideIsNewer select x.Left).ToArray().AsReadOnly(), HttpMethod.Patch, delete: true);
}
if (review.RightSideIsNewer?.Length > 0 && l is not null && l.RightSideIsNewer is not null && l.Raw[l.RightSideIsNewerIndex][0] == l.LessThan && !l.RightSideIsNewer.Value) {
throw new Exception(); // LiveSync(logger, page, relativePath, httpClient, relativePath.RightDirectory, (from x in review.RightSideIsNewer select x.Right).ToArray().AsReadOnly(), HttpMethod.Patch, delete: true);
}
if (review.RightSideOnly?.Length > 0 && l is not null && l.RightSideOnly is not null && l.Raw[l.RightSideOnlyIndex][0] == l.Plus && l.RightSideOnly.Value) {
throw new Exception(); // LiveSync(logger, page, relativePath, httpClient, relativePath.RightDirectory, (from x in review.RightSideOnly select x.Right).ToArray().AsReadOnly(), HttpMethod.Put, delete: false);
}
if (review.RightSideOnly?.Length > 0 && l is not null && l.RightSideOnly is not null && l.Raw[l.RightSideOnlyIndex][0] == l.Minus && !l.RightSideOnly.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.RightDirectory, (from x in review.RightSideOnly select x.Right).ToArray().AsReadOnly(), httpMethod: null, delete: true);
}
if (review.LeftSideOnly?.Length > 0 && l is not null && l.LeftSideOnly is not null && l.Raw[l.LeftSideOnlyIndex][0] == l.Plus && l.LeftSideOnly.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.LeftDirectory, (from x in review.LeftSideOnly select x.Left).ToArray().AsReadOnly(), HttpMethod.Get, delete: false);
}
if (review.LeftSideIsNewer?.Length > 0 && l is not null && l.LeftSideIsNewer is not null && l.Raw[l.LeftSideIsNewerIndex][0] == l.GreaterThan && l.LeftSideIsNewer.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.LeftDirectory, (from x in review.LeftSideIsNewer select x.Left).ToArray().AsReadOnly(), HttpMethod.Get, delete: true);
}
if (review.NotEqualBut?.Length > 0 && l is not null && l.NotEqualBut is not null && l.Raw[l.NotEqualButIndex][0] == l.GreaterThan && l.NotEqualBut.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.LeftDirectory, (from x in review.NotEqualBut select x.Left).ToArray().AsReadOnly(), HttpMethod.Get, delete: true);
}
if (review.RightSideIsNewer?.Length > 0 && l is not null && l.RightSideIsNewer is not null && l.Raw[l.RightSideIsNewerIndex][0] == l.GreaterThan && l.RightSideIsNewer.Value) {
LiveSync(logger, page, relativePath, httpClient, relativePath.RightDirectory, (from x in review.RightSideIsNewer select x.Right).ToArray().AsReadOnly(), HttpMethod.Get, delete: true);
}
}
private static void LiveSync(ILogger<Worker> logger, string page, RelativePath relativePath, HttpClient httpClient, string directory, ReadOnlyCollection<Record> records, HttpMethod? httpMethod, bool delete) {
long sum;
try { sum = records.Sum(l => l.Size); } catch (Exception) { sum = 0; }
string size = GetSizeWithSuffix(sum);
if (delete) {
logger.LogInformation("Starting to delete {count} file(s) [{sum}]", records.Count, size);
PreformDeletes(logger, relativePath.RightDirectory, records);
logger.LogInformation("Deleted {count} file(s) [{sum}]", records.Count, size);
}
if (httpMethod is not null) {
logger.LogInformation("Starting to {httpMethod} {count} file(s) [{sum}]", httpMethod.ToString().ToLower(), records.Count, size);
Preform(logger, page, directory, records, httpClient, httpMethod);
logger.LogInformation("{httpMethod}'ed {count} file(s) [{sum}]", httpMethod.ToString(), records.Count, size);
}
}
private static string GetSizeWithSuffix(long value) {
string result;
int i = 0;
string[] SizeSuffixes = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"];
if (value < 0) {
result = "-" + GetSizeWithSuffix(-value);
} else {
while (Math.Round(value / 1024f) >= 1) {
value /= 1024;
i++;
}
result = string.Format("{0:n1} {1}", value, SizeSuffixes[i]);
}
return result;
}
private static void PreformDeletes(ILogger<Worker> logger, string directory, ReadOnlyCollection<Record> records) {
string size;
Record? record;
string count = records.Count.ToString("000000");
#if ShellProgressBar
ProgressBar progressBar = new(records.Count, $"Deleting: {count};", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
for (int i = 0; i < records.Count; i++) {
#if ShellProgressBar
progressBar.Tick();
#endif
record = records[i];
if (record is null) {
continue;
}
size = GetSizeWithSuffix(record.Size);
try {
File.Delete(Path.Combine(directory, record.RelativePath));
logger.LogInformation("{i} of {count} - Deleted: <{RelativePath}> - {size};", i.ToString("000000"), count, record.RelativePath, size);
} catch (Exception) {
logger.LogInformation("Failed to delete: <{RelativePath}> - {size};", record.RelativePath, size);
}
}
#if ShellProgressBar
progressBar.Dispose();
#endif
}
private static void Preform(ILogger<Worker> logger, string page, string directory, ReadOnlyCollection<Record> records, HttpClient httpClient, HttpMethod httpMethod) {
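// GET/DELETE requests pass size, ticks and the URL-encoded path in the query string; PATCH/PUT upload the
// file as multipart form data. Successful GET responses are written back to disk and the file's original
// last-write time is restored from Ticks.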
Verb verb;
long ticks;
string size;
string iValue;
string duration;
DateTime dateTime;
Task<string> response;
HttpRequestMessage httpRequestMessage;
Task<HttpResponseMessage> httpResponseMessage;
string count = records.Count.ToString("000000");
MultipartFormDataContent multipartFormDataContent;
ReadOnlyCollection<Verb> collection = GetVerbCollection(directory, records);
#if ShellProgressBar
ProgressBar progressBar = new(collection.Count, $"{httpMethod}ing: {count};", new ProgressBarOptions() { ProgressCharacter = '─', ProgressBarOnBottom = true, DisableBottomPercentage = true });
#endif
for (int i = 0; i < collection.Count; i++) {
verb = collection[i];
#if ShellProgressBar
progressBar.Tick();
#endif
ticks = DateTime.Now.Ticks;
iValue = (i + 1).ToString("000000");
size = GetSizeWithSuffix(verb.Size);
if (httpMethod == HttpMethod.Get || httpMethod == HttpMethod.Delete) {
httpRequestMessage = new(httpMethod, $"{page}size={verb.Size}&ticks={verb.Ticks}&path={verb.UrlEncodedFile}");
} else if (httpMethod == HttpMethod.Patch || httpMethod == HttpMethod.Put) {
httpRequestMessage = new(httpMethod, $"{page}path={verb.Directory}");
multipartFormDataContent = new();
multipartFormDataContent.Add(new ByteArrayContent(File.ReadAllBytes(verb.File)), "formFiles", verb.Multipart);
multipartFormDataContent.Add(new StringContent(verb.Directory), "path", iValue);
httpRequestMessage.Content = multipartFormDataContent;
} else
throw new NotImplementedException();
httpResponseMessage = httpClient.SendAsync(httpRequestMessage);
httpResponseMessage.Wait(-1);
if (!httpResponseMessage.Result.IsSuccessStatusCode) {
logger.LogInformation("Failed to {httpMethod}: <{display}> - {size};", httpMethod, verb.Display, size);
} else {
try {
if (httpMethod != HttpMethod.Get) {
duration = GetDurationWithSuffix(ticks);
} else {
response = httpResponseMessage.Result.Content.ReadAsStringAsync();
response.Wait();
File.WriteAllText(verb.File, response.Result);
duration = GetDurationWithSuffix(ticks);
dateTime = new DateTime(verb.Ticks).ToLocalTime();
File.SetLastWriteTime(verb.File, dateTime);
}
logger.LogInformation("{i} of {count} - {httpMethod}'ed: <{display}> - {size} - {timeSpan};",
iValue,
count,
httpMethod,
verb.Display,
size,
duration);
} catch (Exception) {
logger.LogInformation("Failed to {httpMethod}: <{display}> - {size};", httpMethod, verb.Display, size);
}
}
}
#if ShellProgressBar
progressBar.Dispose();
#endif
}
private static ReadOnlyCollection<Verb> GetVerbCollection(string directory, ReadOnlyCollection<Record> records) {
List<Verb> results = [];
Verb verb;
string checkFile;
string checkFileName;
string? checkDirectory;
List<Verb> collection = [];
foreach (Record record in records) {
checkFile = Path.Combine(directory, record.RelativePath);
checkFileName = Path.GetFileName(checkFile);
checkDirectory = Path.GetDirectoryName(checkFile);
if (string.IsNullOrEmpty(checkDirectory)) {
continue;
}
if (!Directory.Exists(checkDirectory)) {
_ = Directory.CreateDirectory(checkDirectory);
}
if (File.Exists(checkFile) && new FileInfo(checkFile).Length == 0) {
File.Delete(checkFile);
}
verb = new(Directory: checkDirectory,
Display: $"{checkFileName}{Environment.NewLine}{checkDirectory}",
File: checkFile,
Multipart: $"RelativePath:{record.RelativePath}|Size:{record.Size}|Ticks:{record.Ticks};",
RelativePath: record.RelativePath,
Size: record.Size,
Ticks: record.Ticks,
UrlEncodedFile: HttpUtility.UrlEncode(checkFile));
collection.Add(verb);
}
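// Transfer order: the 100 smallest files first, then the remaining files from largest to smallest.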
Verb[] sorted = (from l in collection orderby l.Size select l).ToArray();
int stop = sorted.Length < 100 ? sorted.Length : 100;
for (int i = 0; i < stop; i++) {
results.Add(sorted[i]);
}
for (int i = sorted.Length - 1; i > stop - 1; i--) {
results.Add(sorted[i]);
}
if (collection.Count != results.Count) {
throw new Exception();
}
return results.AsReadOnly();
}
private static string GetDurationWithSuffix(long ticks) {
string result;
TimeSpan timeSpan = new(DateTime.Now.Ticks - ticks);
if (timeSpan.TotalMilliseconds < 1000) {
result = $"{timeSpan.Milliseconds} ms";
} else if (timeSpan.TotalMilliseconds < 60000) {
result = $"{Math.Floor(timeSpan.TotalSeconds)} s";
} else if (timeSpan.TotalMilliseconds < 3600000) {
result = $"{Math.Floor(timeSpan.TotalMinutes)} m";
} else {
result = $"{Math.Floor(timeSpan.TotalHours)} h";
}
return result;
}
}

View File

@@ -0,0 +1,184 @@
using System.Collections.ObjectModel;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.RegularExpressions;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI6;
internal static partial class Helper20250521 {
[GeneratedRegex(@"[~\-,.0-9]")]
private static partial Regex Number();
[GeneratedRegex(@"[^\u0020-\u007E]")]
private static partial Regex ASCII();
private record Record(string Directory, string FileNameWithoutExtension);
private record LineCheck(string[] Segments, DateTime TransactionDate, DateTime EffectiveDate) {
internal static LineCheck Get(int dateLineSegmentCount, string datePattern, string line) {
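// A line starts a transaction when it has at least dateLineSegmentCount space-separated tokens and its
// first two tokens parse as dates in datePattern; otherwise both dates stay DateTime.MinValue.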
LineCheck result;
string[] segments = line.Split(' ');
if (segments.Length >= dateLineSegmentCount
&& segments[0].Length == datePattern.Length
&& segments[1].Length == datePattern.Length
&& DateTime.TryParseExact(segments[0], datePattern, CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime transactionDate)
&& DateTime.TryParseExact(segments[1], datePattern, CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime effectiveDate)) {
result = new(Segments: segments, TransactionDate: transactionDate, EffectiveDate: effectiveDate);
} else {
result = new(Segments: segments, TransactionDate: DateTime.MinValue, EffectiveDate: DateTime.MinValue);
}
return result;
}
}
private record RecordB(int I,
DateTime TransactionDate,
DateTime EffectiveDate,
string Description,
decimal WithdrawalOrDeposit,
decimal Balance);
[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(RecordB[]))]
private partial class Helper20250521RecordB : JsonSerializerContext {
}
internal static void MatchDirectory(ILogger<Worker> logger, List<string> args) {
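// Index files matching searchPattern by file name, move files matching searchPatternB next to their match,
// parse the dated transaction lines into RecordB entries, and write them as JSON under .vscode.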
Record record;
string datePattern = args[5];
string searchPattern = args[2];
string searchPatternB = args[3];
string columns = args[6].Replace('~', ',');
int dateLineSegmentCount = int.Parse(args[4]);
string sourceDirectory = Path.GetFullPath(args[0].Split('~')[0]);
ReadOnlyDictionary<string, string> keyValuePairs = GetKeyValuePairs(searchPattern, sourceDirectory);
MoveMatchDirectory(searchPatternB, keyValuePairs, sourceDirectory);
ReadOnlyCollection<RecordB> records = GetRecords(searchPatternB, sourceDirectory, dateLineSegmentCount, datePattern, columns);
WriteRecords(sourceDirectory, records);
}
private static ReadOnlyDictionary<string, string> GetKeyValuePairs(string searchPattern, string sourceDirectory) {
Dictionary<string, string> results = [];
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
foreach (string file in files) {
string? directory = Path.GetDirectoryName(file);
if (string.IsNullOrEmpty(directory)) {
continue;
}
results.Add(Path.GetFileNameWithoutExtension(file), directory);
}
return results.AsReadOnly();
}
private static void MoveMatchDirectory(string searchPatternB, ReadOnlyDictionary<string, string> keyValuePairs, string sourceDirectory) {
string checkFile;
string fileNameWithoutExtension;
string[] files = Directory.GetFiles(sourceDirectory, searchPatternB, SearchOption.AllDirectories);
foreach (string file in files) {
fileNameWithoutExtension = Path.GetFileNameWithoutExtension(file);
if (!keyValuePairs.TryGetValue(fileNameWithoutExtension, out string? match))
continue;
checkFile = Path.Combine(match, Path.GetFileName(file));
if (File.Exists(checkFile))
continue;
File.Move(file, checkFile);
}
}
private static ReadOnlyCollection<RecordB> GetRecords(string searchPatternB, string sourceDirectory, int dateLineSegmentCount, string datePattern, string columns) {
List<RecordB> results = [];
string line;
string[] lines;
RecordB? record;
LineCheck lineCheck;
string[] files = Directory.GetFiles(sourceDirectory, searchPatternB, SearchOption.AllDirectories);
foreach (string file in files) {
lines = File.ReadAllLines(file);
for (int i = 0; i < lines.Length; i++) {
line = lines[i];
if (string.IsNullOrEmpty(line)) {
continue;
}
lineCheck = LineCheck.Get(dateLineSegmentCount, datePattern, line);
if (lineCheck.EffectiveDate == DateTime.MinValue || lineCheck.TransactionDate == DateTime.MinValue) {
continue;
} else {
record = GetRecord(dateLineSegmentCount, datePattern, lines, i, lineCheck.Segments, lineCheck.TransactionDate, lineCheck.EffectiveDate);
if (record is not null) {
i = record.I;
results.Add(record);
}
}
}
}
return results.AsReadOnly();
}
private static RecordB? GetRecord(int dateLineSegmentCount, string datePattern, string[] lines, int i, string[] segments, DateTime transactionDate, DateTime effectiveDate) {
RecordB? result = null;
string line;
LineCheck lineCheck;
List<string> collection = [];
for (int j = i + 1; j < lines.Length; j++) {
line = lines[j];
if (string.IsNullOrEmpty(line)) {
continue;
}
lineCheck = LineCheck.Get(dateLineSegmentCount, datePattern, line);
if (lineCheck.EffectiveDate == DateTime.MinValue || lineCheck.TransactionDate == DateTime.MinValue) {
collection.Add(line);
} else {
if (lineCheck.Segments.Length > dateLineSegmentCount) {
collection.Insert(0, string.Join(' ', lineCheck.Segments.Skip(2)));
}
result = GetRecord(transactionDate, effectiveDate, collection.AsReadOnly(), j - 1);
break;
}
}
if (result is null && collection.Count > 0) {
result = GetRecord(transactionDate, effectiveDate, collection.AsReadOnly(), lines.Length - 1);
}
return result;
}
private static RecordB? GetRecord(DateTime transactionDate, DateTime effectiveDate, ReadOnlyCollection<string> collection, int i) {
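// Requires two consecutive numeric-only lines: the first is the withdrawal/deposit amount and the second
// the running balance; the earlier buffered lines (everything but the last two, with characters outside
// printable ASCII removed) become the description.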
RecordB? result;
List<string> verified = [];
foreach (string check in collection) {
if (Number().Replace(check, string.Empty).Length != 0) {
verified.Clear();
} else {
verified.Add(check);
}
if (verified.Count == 2) {
break;
}
}
if (verified.Count != 2) {
result = null;
} else {
decimal balance = decimal.Parse(verified[^1]);
decimal withdrawalOrDeposit = decimal.Parse(verified[^2]);
string description = ASCII().Replace(string.Join(' ', collection.SkipLast(2)), string.Empty);
result = new(I: i,
TransactionDate: transactionDate,
EffectiveDate: effectiveDate,
Description: description,
WithdrawalOrDeposit: withdrawalOrDeposit,
Balance: balance);
}
return result;
}
private static void WriteRecords(string sourceDirectory, ReadOnlyCollection<RecordB> records) {
string json = JsonSerializer.Serialize(records.ToArray(), Helper20250521RecordB.Default.RecordBArray);
string sourceDirectoryVsCode = Path.Combine(sourceDirectory, ".vscode");
if (!Directory.Exists(sourceDirectoryVsCode))
_ = Directory.CreateDirectory(sourceDirectoryVsCode);
File.WriteAllText(Path.Combine(sourceDirectoryVsCode, $"{DateTime.Now.Ticks}.json"), json);
}
}

View File

@@ -0,0 +1,394 @@
using System.Collections.ObjectModel;
using System.Globalization;
using System.Net;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
using IFX.Shared.PasteSpecialXml.EAF.XML.API.Envelope;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI6;
internal static partial class Helper20250601 {
private static readonly bool _IsEnvironment_Development = false;
private record Record(string Text, string Host, int Port, string[] Segments, bool StateContainsDisabled);
private record Status(string CellInstanceName,
string CommunicationState,
string CurrentActiveVersion,
string CurrentHost,
string ErrorDescription,
string Host,
string IsReadyForRestart,
string NPort,
int Port,
string StartTime,
string Startable,
string State,
string StopTime,
string Text);
internal static void EquipmentAutomationFrameworkStatus(ILogger<Worker> logger, List<string> args) {
Status status;
Record? record;
List<string[]> messages;
logger.LogInformation(args[0]);
logger.LogInformation(args[1]);
logger.LogInformation(args[2]);
string[] cellInstanceNames = args[2].Split('~');
Dictionary<string, Record> records;
if (_IsEnvironment_Development) {
records = GetEquipmentAutomationFrameworkCellInstanceStatus(development: true, staging: false, production: false);
} else {
records = GetEquipmentAutomationFrameworkCellInstanceStatus(development: false, staging: true, production: true);
}
foreach (string cellInstanceName in cellInstanceNames) {
if (!records.TryGetValue(cellInstanceName, out record)) {
logger.LogWarning("{cellInstance} not found!", cellInstanceName);
continue;
}
status = EquipmentAutomationFrameworkCellInstanceStatus(cellInstanceName, record);
logger.LogInformation("{host}) {cellInstanceName} => {status}", record.Host, cellInstanceName, status.ToString());
}
}
private static Dictionary<string, Record> GetEquipmentAutomationFrameworkCellInstanceStatus(bool development, bool staging, bool production) {
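// Replays a captured soap+msbin1 GetFactoryStatus request against each EAF host, scrubs the binary response
// down to printable text, then splits it on "CellName"/"WindowsName" markers to collect one Record per cell
// instance; an entry stored as Disabled is replaced if the cell appears again.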
Dictionary<string, Record> results = [];
string key;
string host;
string text;
string state;
string response;
bool stop = false;
string[] segments;
string[] cellNames;
byte[] responseBytes;
string responseAfter;
#pragma warning disable SYSLIB0014
WebClient webClient = new();
#pragma warning restore SYSLIB0014
string disabled = "Disabled";
UnicodeCategory unicodeCategory;
StringBuilder stringBuilder = new();
EquipmentAutomationFrameworkCellInstanceParseCheck();
Dictionary<char, char> unicodeReplaces = GetUnicodeReplaces();
List<UnicodeCategory> unicodeCategories = GetUnicodeCategory();
ReadOnlyCollection<string> urls = GetUrls(development, staging, production);
// Dictionary<UnicodeCategory, List<char>> unicodeCategoriesList = new Dictionary<UnicodeCategory, List<char>>();
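// Pre-captured WCF binary XML (application/soap+msbin1) body for the IStatusQuery/GetFactoryStatus call;
// it is replayed unchanged against every host in urls.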
byte[] bodyBytes = [86, 2, 11, 1, 115, 4, 11, 1, 97, 6, 86, 8, 68, 10, 30, 0, 130, 153, 48, 104, 116, 116, 112, 58, 47, 47, 116, 101, 109, 112, 117, 114, 105, 46, 111, 114, 103, 47, 73, 83, 116, 97, 116, 117, 115, 81, 117, 101, 114, 121, 47, 71, 101, 116, 70, 97, 99, 116, 111, 114, 121, 83, 116, 97, 116, 117, 115, 68, 26, 173, 181, 241, 2, 149, 65, 209, 208, 66, 143, 234, 233, 157, 246, 118, 78, 238, 68, 44, 68, 42, 171, 20, 1, 68, 12, 30, 0, 130, 153, 49, 104, 116, 116, 112, 58, 47, 47, 101, 97, 102, 45, 112, 114, 111, 100, 46, 109, 101, 115, 46, 105, 110, 102, 105, 110, 101, 111, 110, 46, 99, 111, 109, 58, 57, 48, 48, 51, 47, 83, 116, 97, 116, 117, 115, 81, 117, 101, 114, 121, 1, 86, 14, 64, 16, 71, 101, 116, 70, 97, 99, 116, 111, 114, 121, 83, 116, 97, 116, 117, 115, 8, 19, 104, 116, 116, 112, 58, 47, 47, 116, 101, 109, 112, 117, 114, 105, 46, 111, 114, 103, 47, 64, 16, 105, 110, 99, 108, 117, 100, 101, 65, 103, 101, 110, 116, 76, 105, 115, 116, 135, 64, 17, 105, 110, 99, 108, 117, 100, 101, 83, 116, 97, 116, 117, 115, 76, 105, 115, 116, 135, 64, 23, 101, 120, 116, 101, 110, 100, 101, 100, 83, 116, 97, 116, 117, 115, 67, 101, 108, 108, 78, 97, 109, 101, 115, 9, 1, 98, 57, 104, 116, 116, 112, 58, 47, 47, 115, 99, 104, 101, 109, 97, 115, 46, 109, 105, 99, 114, 111, 115, 111, 102, 116, 46, 99, 111, 109, 47, 50, 48, 48, 51, 47, 49, 48, 47, 83, 101, 114, 105, 97, 108, 105, 122, 97, 116, 105, 111, 110, 47, 65, 114, 114, 97, 121, 115, 9, 1, 105, 41, 104, 116, 116, 112, 58, 47, 47, 119, 119, 119, 46, 119, 51, 46, 111, 114, 103, 47, 50, 48, 48, 49, 47, 88, 77, 76, 83, 99, 104, 101, 109, 97, 45, 105, 110, 115, 116, 97, 110, 99, 101, 95, 6, 115, 116, 114, 105, 110, 103, 153, 20, 66, 73, 79, 82, 65, 68, 53, 95, 70, 105, 108, 101, 65, 114, 99, 104, 105, 118, 101, 114, 1, 1, 1, 1];
foreach (string url in urls) {
if (stop) {
break;
}
segments = url.Split(':');
host = segments[0];
if (segments.Length < 2 || !int.TryParse(segments[1], out int port)) {
port = 80;
}
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+msbin1");
responseBytes = webClient.UploadData($"http://{host}:{port}/StatusQuery", bodyBytes);
// File.WriteAllText(@"L:\Tmp\a.txt", BitConverter.ToString(responseBytes));
response = Encoding.UTF8.GetString(responseBytes);
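// The msbin1 payload is treated as UTF-8 text: mapped control characters become whitespace and only
// characters in the allowed Unicode categories are kept, so the result can be split with plain string operations.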
_ = stringBuilder.Clear(); // reset between hosts so responses do not accumulate
foreach (char c in response) {
unicodeCategory = CharUnicodeInfo.GetUnicodeCategory(c);
if (unicodeCategory == UnicodeCategory.Control && unicodeReplaces.ContainsKey(c)) {
_ = stringBuilder.Append(unicodeReplaces[c]);
} else if (unicodeCategories.Contains(unicodeCategory)) {
_ = stringBuilder.Append(c);
}
}
responseAfter = stringBuilder.ToString();
cellNames = responseAfter.Split(new string[] { "CellName" }, StringSplitOptions.None);
foreach (string segment in cellNames) {
if (stop) {
break;
}
key = string.Empty;
state = string.Empty;
segments = segment.Split(new string[] { "WindowsName" }, StringSplitOptions.None);
if (segments.Length != 2) {
continue;
}
text = segments[0];
segments = text.Replace('\r', ' ').Replace('\n', ' ').Split(' ');
for (int i = 0; i < segments.Length - 3; i++) {
if (stop) {
break;
}
if (!string.IsNullOrEmpty(segments[i]) && string.IsNullOrEmpty(key)) {
key = segments[i].Trim();
} else if (segments[i].StartsWith("State")) {
state = segments[i + 1];
break;
}
}
if (key.EndsWith("a")) {
key = key[..^1];
}
if (!results.ContainsKey(key)) {
results.Add(key, new Record(Text: text, Host: host, Port: port, Segments: segments, StateContainsDisabled: state.Contains(disabled)));
} else if (results[key].StateContainsDisabled) {
results[key] = new Record(Text: text, Host: host, Port: port, Segments: segments, StateContainsDisabled: state.Contains(disabled));
}
}
}
return results;
}
private static ReadOnlyCollection<string> GetUrls(bool development, bool staging, bool production) {
List<string> results = [];
if (development) {
results.Add("eaf-dev.mes.infineon.com:9003");
}
if (staging) {
results.Add("eaf-staging.mes.infineon.com:9003");
}
if (production) {
results.Add("eaf-prod.mes.infineon.com:9003");
}
return results.AsReadOnly();
}
private static List<UnicodeCategory> GetUnicodeCategory() {
List<UnicodeCategory> unicodeCategories = [
// UnicodeCategory.Control, // 33 - <20>
UnicodeCategory.UppercaseLetter, // 25 - ABCDEFGHIJKLMNOPQRSTUVWXY
UnicodeCategory.LowercaseLetter, // 25 - abcdefghiklmnopqrstuvwxyz
UnicodeCategory.DecimalDigitNumber, // 10 - 0123456789
UnicodeCategory.OtherPunctuation, // 10 - !"#%&,./:@
UnicodeCategory.ClosePunctuation, // 2 - )]
UnicodeCategory.MathSymbol, // 2 - |؈
UnicodeCategory.OpenPunctuation, // 2 - ([
// UnicodeCategory.OtherSymbol, // 1 - <20>
UnicodeCategory.DashPunctuation, // 1 - -
UnicodeCategory.ConnectorPunctuation, // 1 - _
UnicodeCategory.ModifierSymbol, // 1 - `
UnicodeCategory.NonSpacingMark, // 1 - ̵
UnicodeCategory.SpaceSeparator, // 1 -
UnicodeCategory.CurrencySymbol, // 1 - $
];
return unicodeCategories;
}
private static void EquipmentAutomationFrameworkCellInstanceParseCheck() {
Envelope? envelope;
string xmlStart621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:6eb7a538-0b2b-4d04-8f2a-ab50e1e5338a</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlStart891 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/StartAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:38977fa4-262a-42fb-8df7-d8d3074820b2</a:RelatesTo></s:Header><s:Body><StartAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlStart748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlStart707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string xmlStop621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:97f7aeb4-015f-440b-b0ff-a2a5aa4f4ab9</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlStop889 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/StopAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:04b8b0ea-8576-4756-b456-8a817cd10826</a:RelatesTo></s:Header><s:Body><StopAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlStop748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlStop707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string xmlRestart621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:e228a621-e7ab-4ebf-97ba-5571cb5f4ad7</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlRestart895 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/RestartAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:efaeaf12-4aa0-4cd1-8296-05019e47261a</a:RelatesTo></s:Header><s:Body><RestartAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlRestart748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlRestart707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string[] xmlSets = [xmlStart621, xmlStart891, xmlStart748, xmlStart707, xmlStop621, xmlStop889, xmlStop748, xmlStop707, xmlRestart621, xmlRestart895, xmlRestart748, xmlRestart707];
foreach (string xmlSet in xmlSets) {
envelope = ParseXML<Envelope>(xmlSet, throwExceptions: true);
}
}
private static T? ParseXML<T>(string value, bool throwExceptions) where T : class {
object? result = null;
try {
Stream stream = ToStream(value.Trim());
XmlReader xmlReader = XmlReader.Create(stream, new XmlReaderSettings() { ConformanceLevel = ConformanceLevel.Document });
#pragma warning disable IL2026, IL2090
XmlSerializer xmlSerializer = new(typeof(T), typeof(T).GetNestedTypes());
result = xmlSerializer.Deserialize(xmlReader);
#pragma warning restore IL2026, IL2090
stream.Dispose();
} catch (Exception) {
if (throwExceptions) {
throw;
}
}
return result as T;
}
private static Stream ToStream(string value) {
MemoryStream memoryStream = new();
StreamWriter streamWriter = new(memoryStream);
streamWriter.Write(value);
streamWriter.Flush();
memoryStream.Position = 0;
return memoryStream;
}
private static Dictionary<char, char> GetUnicodeReplaces() {
Dictionary<char, char> results = new() {
{ '\u0000', ' ' },
{ '\u0001', ' ' },
{ '\u0002', ' ' },
{ '\u0003', ' ' },
{ '\u0004', ' ' },
{ '\u0005', ' ' },
{ '\u0006', ' ' },
{ '\u0007', ' ' },
{ '\u0008', ' ' },
{ '\u0009', '\t' },
{ '\u000A', '\r' },
{ '\u000B', '\r' },
{ '\u000C', '\t' },
{ '\u000D', '\r' },
{ '\u000E', ' ' },
{ '\u000F', ' ' },
{ '\u0010', ' ' },
{ '\u0011', ' ' },
{ '\u0012', ' ' },
{ '\u0013', ' ' },
{ '\u0014', ' ' },
{ '\u0015', ' ' },
{ '\u0016', ' ' },
{ '\u0017', ' ' },
{ '\u0018', ' ' },
{ '\u0019', ' ' },
{ '\u001A', ' ' },
{ '\u001B', ' ' },
{ '\u001C', '\r' },
{ '\u001D', '\t' },
{ '\u001E', '\t' },
{ '\u001F', '\t' },
{ '\u007F', ' ' },
// C1
{ '\u0080', '\t' },
{ '\u0081', ' ' },
{ '\u0082', ' ' },
{ '\u0083', ' ' },
{ '\u0084', ' ' },
{ '\u0085', '\r' },
{ '\u0086', ' ' },
{ '\u0087', ' ' },
{ '\u0088', '\t' },
{ '\u0089', '\t' },
{ '\u008A', '\t' },
{ '\u008B', '\r' },
{ '\u008C', ' ' },
{ '\u008D', ' ' },
{ '\u008E', ' ' },
{ '\u008F', ' ' },
{ '\u0090', ' ' },
{ '\u0091', ' ' },
{ '\u0092', ' ' },
{ '\u0093', ' ' },
{ '\u0094', ' ' },
{ '\u0095', ' ' },
{ '\u0096', ' ' },
{ '\u0097', ' ' },
{ '\u0098', ' ' },
{ '\u0099', ' ' },
{ '\u009A', ' ' },
{ '\u009B', ' ' },
{ '\u009C', ' ' },
{ '\u009D', ' ' },
{ '\u009E', ' ' },
{ '\u009F', ' ' }
};
return results;
}
private static Status EquipmentAutomationFrameworkCellInstanceStatus(string cellInstanceName, Record record) {
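// Scan the whitespace-split status text for known field names; most values are the token(s) immediately
// following the matched field name.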
Status result;
bool stop = false;
string state = string.Empty;
string stopTime = string.Empty;
string startTime = string.Empty;
string startable = string.Empty;
string currentHost = string.Empty;
string errorDescription = string.Empty;
string isReadyForRestart = string.Empty;
string communicationState = string.Empty;
string currentActiveVersion = string.Empty;
for (int i = 0; i < record.Segments.Length - 3; i++) {
if (stop) {
break;
}
if (string.IsNullOrEmpty(state) && record.Segments[i].StartsWith("State")) {
state = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(startable) && record.Segments[i].Contains("Startable")) {
startable = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(stopTime) && record.Segments[i].StartsWith("StopTime")) {
stopTime = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(currentHost) && record.Segments[i].Contains("CurrentHost")) {
currentHost = $"{record.Segments[i]} {record.Segments[i + 1]} {record.Segments[i + 2]}";
} else if (string.IsNullOrEmpty(errorDescription) && record.Segments[i].StartsWith("ErrorDescription")) {
errorDescription = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(communicationState) && record.Segments[i].StartsWith("CommunicationState")) {
communicationState = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(isReadyForRestart) && record.Segments[i].StartsWith("IsReadyForRestart")) {
isReadyForRestart = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(currentActiveVersion) && record.Segments[i].Contains("CurrentActiveVersion")) {
currentActiveVersion = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(startTime) && record.Segments[i].Contains("StartTime")) {
startTime = $"{record.Segments[i + 1]} {record.Segments[i + 2]} {record.Segments[i + 3]}".Split('\t')[0];
}
}
if (errorDescription != "a") {
string[] segments = record.Text.Split(new string[] { "ErrorDescription" }, StringSplitOptions.RemoveEmptyEntries);
if (segments.Length > 1) {
segments = segments[1].Split(new string[] { "Info" }, StringSplitOptions.RemoveEmptyEntries);
errorDescription = segments[0].Trim();
}
}
string nPort;
Dictionary<string, string> nPorts = GetnPorts();
if (!nPorts.ContainsKey(cellInstanceName)) {
nPort = string.Empty;
} else {
nPort = nPorts[cellInstanceName];
}
if (state.EndsWith("a")) {
state = state[0..^1];
}
if (state == "Running" && communicationState == "Not") {
state = "Warning";
}
result = new(Host: record.Host,
Port: record.Port,
Text: record.Text,
NPort: nPort,
State: state,
CellInstanceName: cellInstanceName,
StopTime: stopTime,
StartTime: startTime,
Startable: startable,
CurrentHost: currentHost,
ErrorDescription: errorDescription,
IsReadyForRestart: isReadyForRestart,
CommunicationState: communicationState,
CurrentActiveVersion: currentActiveVersion);
return result;
}
private static Dictionary<string, string> GetnPorts() {
Dictionary<string, string> results = new() {
{ "TENCOR1", "10.95.192.31" },
{ "TENCOR2", "10.95.192.32" },
{ "TENCOR3", "10.95.192.33" },
{ "HGCV1", "10.95.192.34" },
{ "HGCV2", "10.95.154.17" },
{ "HGCV3", "10.95.192.36" },
{ "BIORAD2", "10.95.192.37" },
{ "BIORAD3", "10.95.192.38" },
{ "CDE2", "10.95.192.39" },
{ "CDE3", "10.95.154.19" },
{ "CDE5", "10.95.192.40" },
{ "SPARE-1", "10.95.192.47" },
{ "SPARE-2", "10.95.192.48" },
{ "SPARE-3", "10.95.192.49" },
{ "SPARE-4", "10.95.192.50" },
{ "SPARE-5", "10.95.192.51" },
};
return results;
}
}

View File

@@ -0,0 +1,545 @@
using System.Collections.ObjectModel;
using System.Globalization;
using System.Net;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
using IFX.Shared.PasteSpecialXml.EAF.XML.API.Envelope;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI6;
internal static partial class Helper20250602 {
private record Record(string Text, string Host, int Port, string[] Segments, bool StateContainsDisabled);
private record Status(string CellInstanceName,
string CommunicationState,
string CurrentActiveVersion,
string CurrentHost,
string ErrorDescription,
string Host,
string IsReadyForRestart,
string NPort,
int Port,
string StartTime,
string Startable,
string State,
string StopTime,
string Text);
internal static void EquipmentAutomationFrameworkCellInstanceStateImageVerbIf(ILogger<Worker> logger, List<string> args) {
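// For each requested cell instance, resolve its Status and log the /images/{cellInstanceName}_{State}.jpg path.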
string path;
Status status;
Record? record;
string verbBy = args[2];
logger.LogInformation(args[0]);
logger.LogInformation(args[1]);
logger.LogInformation(args[2]);
logger.LogInformation(args[3]);
string[] cellInstanceNames = args[3].Split('~');
Dictionary<string, Record> records = GetEquipmentAutomationFrameworkCellInstanceStatus(development: false, staging: false, production: true);
foreach (string cellInstanceName in cellInstanceNames) {
if (!records.TryGetValue(cellInstanceName, out record)) {
logger.LogWarning("{cellInstance} not found!", cellInstanceName);
continue;
}
if (record.StateContainsDisabled) {
logger.LogWarning("{cellInstance} state contains Disabled!", cellInstanceName);
}
status = GetEquipmentAutomationFrameworkCellInstanceStateImageVerbIf(logger, verbBy, cellInstanceName, record);
path = $"/images/{cellInstanceName}_{status.State}.jpg";
logger.LogInformation("{host}) {cellInstanceName} => {state} <{path}>", record.Host, cellInstanceName, status.State, path);
}
}
private static Status GetEquipmentAutomationFrameworkCellInstanceStateImageVerbIf(ILogger<Worker> logger, string verbBy, string cellInstanceName, Record record) {
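// Resolve the cell's Status, then issue a Start when the state is Offline or a Restart when it is Warning
// (guarded by the per-cell trigger time).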
Status result;
Dictionary<string, DateTime> equipmentAutomationFrameworkTriggers = [];
if (!equipmentAutomationFrameworkTriggers.ContainsKey(cellInstanceName)) {
equipmentAutomationFrameworkTriggers.Add(cellInstanceName, DateTime.MinValue);
}
result = EquipmentAutomationFrameworkCellInstanceStatus(cellInstanceName, record);
if (equipmentAutomationFrameworkTriggers[cellInstanceName] < DateTime.Now.AddSeconds(-60)) {
if (result.State == "Offline") {
EquipmentAutomationFrameworkCellInstanceStart(record.Host, record.Port, cellInstanceName, verbBy);
logger.LogInformation("Start invoked for {cellName}", cellInstanceName);
} else if (result.State == "Warning") {
EquipmentAutomationFrameworkCellInstanceRestart(record.Host, record.Port, cellInstanceName, verbBy);
logger.LogInformation("Restart invoked for {cellName}", cellInstanceName);
}
}
return result;
}
private static Dictionary<string, Record> GetEquipmentAutomationFrameworkCellInstanceStatus(bool development, bool staging, bool production) {
Dictionary<string, Record> results = [];
string key;
string host;
string text;
string state;
string response;
bool stop = false;
string[] segments;
string[] cellNames;
byte[] responseBytes;
string responseAfter;
#pragma warning disable SYSLIB0014
WebClient webClient = new();
#pragma warning restore SYSLIB0014
string disabled = "Disabled";
UnicodeCategory unicodeCategory;
StringBuilder stringBuilder = new();
EquipmentAutomationFrameworkCellInstanceParseCheck();
Dictionary<char, char> unicodeReplaces = GetUnicodeReplaces();
List<UnicodeCategory> unicodeCategories = GetUnicodeCategory();
ReadOnlyCollection<string> urls = GetUrls(development, staging, production);
// Dictionary<UnicodeCategory, List<char>> unicodeCategoriesList = new Dictionary<UnicodeCategory, List<char>>();
byte[] bodyBytes = [86, 2, 11, 1, 115, 4, 11, 1, 97, 6, 86, 8, 68, 10, 30, 0, 130, 153, 48, 104, 116, 116, 112, 58, 47, 47, 116, 101, 109, 112, 117, 114, 105, 46, 111, 114, 103, 47, 73, 83, 116, 97, 116, 117, 115, 81, 117, 101, 114, 121, 47, 71, 101, 116, 70, 97, 99, 116, 111, 114, 121, 83, 116, 97, 116, 117, 115, 68, 26, 173, 181, 241, 2, 149, 65, 209, 208, 66, 143, 234, 233, 157, 246, 118, 78, 238, 68, 44, 68, 42, 171, 20, 1, 68, 12, 30, 0, 130, 153, 49, 104, 116, 116, 112, 58, 47, 47, 101, 97, 102, 45, 112, 114, 111, 100, 46, 109, 101, 115, 46, 105, 110, 102, 105, 110, 101, 111, 110, 46, 99, 111, 109, 58, 57, 48, 48, 51, 47, 83, 116, 97, 116, 117, 115, 81, 117, 101, 114, 121, 1, 86, 14, 64, 16, 71, 101, 116, 70, 97, 99, 116, 111, 114, 121, 83, 116, 97, 116, 117, 115, 8, 19, 104, 116, 116, 112, 58, 47, 47, 116, 101, 109, 112, 117, 114, 105, 46, 111, 114, 103, 47, 64, 16, 105, 110, 99, 108, 117, 100, 101, 65, 103, 101, 110, 116, 76, 105, 115, 116, 135, 64, 17, 105, 110, 99, 108, 117, 100, 101, 83, 116, 97, 116, 117, 115, 76, 105, 115, 116, 135, 64, 23, 101, 120, 116, 101, 110, 100, 101, 100, 83, 116, 97, 116, 117, 115, 67, 101, 108, 108, 78, 97, 109, 101, 115, 9, 1, 98, 57, 104, 116, 116, 112, 58, 47, 47, 115, 99, 104, 101, 109, 97, 115, 46, 109, 105, 99, 114, 111, 115, 111, 102, 116, 46, 99, 111, 109, 47, 50, 48, 48, 51, 47, 49, 48, 47, 83, 101, 114, 105, 97, 108, 105, 122, 97, 116, 105, 111, 110, 47, 65, 114, 114, 97, 121, 115, 9, 1, 105, 41, 104, 116, 116, 112, 58, 47, 47, 119, 119, 119, 46, 119, 51, 46, 111, 114, 103, 47, 50, 48, 48, 49, 47, 88, 77, 76, 83, 99, 104, 101, 109, 97, 45, 105, 110, 115, 116, 97, 110, 99, 101, 95, 6, 115, 116, 114, 105, 110, 103, 153, 20, 66, 73, 79, 82, 65, 68, 53, 95, 70, 105, 108, 101, 65, 114, 99, 104, 105, 118, 101, 114, 1, 1, 1, 1];
foreach (string url in urls) {
if (stop) {
break;
}
segments = url.Split(':');
host = segments[0];
if (segments.Length < 2 || !int.TryParse(segments[1], out int port)) {
port = 80;
}
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+msbin1");
responseBytes = webClient.UploadData($"http://{host}:{port}/StatusQuery", bodyBytes);
// File.WriteAllText(@"L:\Tmp\a.txt", BitConverter.ToString(responseBytes));
response = Encoding.UTF8.GetString(responseBytes);
_ = stringBuilder.Clear(); // reset between hosts so responses do not accumulate
foreach (char c in response) {
unicodeCategory = CharUnicodeInfo.GetUnicodeCategory(c);
if (unicodeCategory == UnicodeCategory.Control && unicodeReplaces.ContainsKey(c)) {
_ = stringBuilder.Append(unicodeReplaces[c]);
} else if (unicodeCategories.Contains(unicodeCategory)) {
_ = stringBuilder.Append(c);
}
}
responseAfter = stringBuilder.ToString();
cellNames = responseAfter.Split(new string[] { "CellName" }, StringSplitOptions.None);
foreach (string segment in cellNames) {
if (stop) {
break;
}
key = string.Empty;
state = string.Empty;
segments = segment.Split(new string[] { "WindowsName" }, StringSplitOptions.None);
if (segments.Length != 2) {
continue;
}
text = segments[0];
segments = text.Replace('\r', ' ').Replace('\n', ' ').Split(' ');
for (int i = 0; i < segments.Length - 3; i++) {
if (stop) {
break;
}
if (!string.IsNullOrEmpty(segments[i]) && string.IsNullOrEmpty(key)) {
key = segments[i].Trim();
} else if (segments[i].StartsWith("State")) {
state = segments[i + 1];
break;
}
}
if (key.EndsWith("a")) {
key = key[..^1];
}
if (!results.ContainsKey(key)) {
results.Add(key, new Record(Text: text, Host: host, Port: port, Segments: segments, StateContainsDisabled: state.Contains(disabled)));
} else if (results[key].StateContainsDisabled) {
results[key] = new Record(Text: text, Host: host, Port: port, Segments: segments, StateContainsDisabled: state.Contains(disabled));
}
}
}
return results;
}
private static ReadOnlyCollection<string> GetUrls(bool development, bool staging, bool production) {
List<string> results = [];
if (development) {
results.Add("eaf-dev.mes.infineon.com:9003");
}
if (staging) {
results.Add("eaf-staging.mes.infineon.com:9003");
}
if (production) {
results.Add("eaf-prod.mes.infineon.com:9003");
}
return results.AsReadOnly();
}
private static List<UnicodeCategory> GetUnicodeCategory() {
List<UnicodeCategory> unicodeCategories = [
// UnicodeCategory.Control, // 33 - <20>
UnicodeCategory.UppercaseLetter, // 25 - ABCDEFGHIJKLMNOPQRSTUVWXY
UnicodeCategory.LowercaseLetter, // 25 - abcdefghiklmnopqrstuvwxyz
UnicodeCategory.DecimalDigitNumber, // 10 - 0123456789
UnicodeCategory.OtherPunctuation, // 10 - !"#%&,./:@
UnicodeCategory.ClosePunctuation, // 2 - )]
UnicodeCategory.MathSymbol, // 2 - |؈
UnicodeCategory.OpenPunctuation, // 2 - ([
// UnicodeCategory.OtherSymbol, // 1 - <20>
UnicodeCategory.DashPunctuation, // 1 - -
UnicodeCategory.ConnectorPunctuation, // 1 - _
UnicodeCategory.ModifierSymbol, // 1 - `
UnicodeCategory.NonSpacingMark, // 1 - ̵
UnicodeCategory.SpaceSeparator, // 1 -
UnicodeCategory.CurrencySymbol, // 1 - $
];
return unicodeCategories;
}
private static void EquipmentAutomationFrameworkCellInstanceParseCheck() {
Envelope? envelope;
string xmlStart621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:6eb7a538-0b2b-4d04-8f2a-ab50e1e5338a</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlStart891 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/StartAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:38977fa4-262a-42fb-8df7-d8d3074820b2</a:RelatesTo></s:Header><s:Body><StartAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlStart748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlStart707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:31c290af-2312-4b00-a57c-d5e1ab51e02a</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:f169e50f-5ca8-43cd-a1e9-724840ff5e00</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string xmlStop621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:97f7aeb4-015f-440b-b0ff-a2a5aa4f4ab9</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlStop889 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/StopAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:04b8b0ea-8576-4756-b456-8a817cd10826</a:RelatesTo></s:Header><s:Body><StopAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlStop748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlStop707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:e34d16ad-21d5-4a11-a6dc-5b5b58a74f96</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:c9a4d5b6-435b-49a4-a2f9-d93cd8aecc36</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string xmlRestart621 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequenceResponse</a:Action><a:RelatesTo>urn:uuid:e228a621-e7ab-4ebf-97ba-5571cb5f4ad7</a:RelatesTo></s:Header><s:Body><CreateSequenceResponse xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</Identifier><Accept><AcksTo><a:Address>http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:Address></AcksTo></Accept></CreateSequenceResponse></s:Body></s:Envelope>";
string xmlRestart895 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/RestartAllCellInstancesResponse</a:Action><a:RelatesTo>urn:uuid:efaeaf12-4aa0-4cd1-8296-05019e47261a</a:RelatesTo></s:Header><s:Body><RestartAllCellInstancesResponse xmlns=\"http://tempuri.org/\"/></s:Body></s:Envelope>";
string xmlRestart748 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:Sequence s:mustUnderstand=\"1\"><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action></s:Header><s:Body/></s:Envelope>";
string xmlRestart707 = "<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:a1650ed7-34dc-4fac-993f-ed2559c453a2</r:Identifier><r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/><netrm:BufferRemaining xmlns:netrm=\"http://schemas.microsoft.com/ws/2006/05/rm\">8</netrm:BufferRemaining></r:SequenceAcknowledgement><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:50c82506-bd4d-4117-b632-640cf84d556e</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>";
string[] xmlSets = [xmlStart621, xmlStart891, xmlStart748, xmlStart707, xmlStop621, xmlStop889, xmlStop748, xmlStop707, xmlRestart621, xmlRestart895, xmlRestart748, xmlRestart707];
foreach (string xmlSet in xmlSets) {
envelope = ParseXML<Envelope>(xmlSet, throwExceptions: true);
}
}
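// Deserializes an XML string into T (including T's nested types); returns null on failure unless throwExceptions is set.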
private static T? ParseXML<T>(string value, bool throwExceptions) where T : class {
object? result = null;
try {
Stream stream = ToStream(value.Trim());
XmlReader xmlReader = XmlReader.Create(stream, new XmlReaderSettings() { ConformanceLevel = ConformanceLevel.Document });
#pragma warning disable IL2026, IL2090
XmlSerializer xmlSerializer = new(typeof(T), typeof(T).GetNestedTypes());
result = xmlSerializer.Deserialize(xmlReader);
#pragma warning restore IL2026, IL2090
stream.Dispose();
} catch (Exception) {
if (throwExceptions) {
throw;
}
}
return result as T;
}
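// Writes the string into a MemoryStream and rewinds it so XmlReader.Create can consume it.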
private static Stream ToStream(string value) {
MemoryStream memoryStream = new();
StreamWriter streamWriter = new(memoryStream);
streamWriter.Write(value);
streamWriter.Flush();
memoryStream.Position = 0;
return memoryStream;
}
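// Maps C0 and C1 control characters to plain whitespace (space, tab, or carriage return), presumably so raw status text can be split and displayed safely.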
private static Dictionary<char, char> GetUnicodeReplaces() {
Dictionary<char, char> results = new() {
{ '\u0000', ' ' },
{ '\u0001', ' ' },
{ '\u0002', ' ' },
{ '\u0003', ' ' },
{ '\u0004', ' ' },
{ '\u0005', ' ' },
{ '\u0006', ' ' },
{ '\u0007', ' ' },
{ '\u0008', ' ' },
{ '\u0009', '\t' },
{ '\u000A', '\r' },
{ '\u000B', '\r' },
{ '\u000C', '\t' },
{ '\u000D', '\r' },
{ '\u000E', ' ' },
{ '\u000F', ' ' },
{ '\u0010', ' ' },
{ '\u0011', ' ' },
{ '\u0012', ' ' },
{ '\u0013', ' ' },
{ '\u0014', ' ' },
{ '\u0015', ' ' },
{ '\u0016', ' ' },
{ '\u0017', ' ' },
{ '\u0018', ' ' },
{ '\u0019', ' ' },
{ '\u001A', ' ' },
{ '\u001B', ' ' },
{ '\u001C', '\r' },
{ '\u001D', '\t' },
{ '\u001E', '\t' },
{ '\u001F', '\t' },
{ '\u007F', ' ' },
// C1
{ '\u0080', '\t' },
{ '\u0081', ' ' },
{ '\u0082', ' ' },
{ '\u0083', ' ' },
{ '\u0084', ' ' },
{ '\u0085', '\r' },
{ '\u0086', ' ' },
{ '\u0087', ' ' },
{ '\u0088', '\t' },
{ '\u0089', '\t' },
{ '\u008A', '\t' },
{ '\u008B', '\r' },
{ '\u008C', ' ' },
{ '\u008D', ' ' },
{ '\u008E', ' ' },
{ '\u008F', ' ' },
{ '\u0090', ' ' },
{ '\u0091', ' ' },
{ '\u0092', ' ' },
{ '\u0093', ' ' },
{ '\u0094', ' ' },
{ '\u0095', ' ' },
{ '\u0096', ' ' },
{ '\u0097', ' ' },
{ '\u0098', ' ' },
{ '\u0099', ' ' },
{ '\u009A', ' ' },
{ '\u009B', ' ' },
{ '\u009C', ' ' },
{ '\u009D', ' ' },
{ '\u009E', ' ' },
{ '\u009F', ' ' }
};
return results;
}
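// Walks the record segments for State, Startable, StopTime, CurrentHost, ErrorDescription, CommunicationState, IsReadyForRestart, CurrentActiveVersion, and StartTime, then builds a Status; a Running cell whose CommunicationState segment is "Not" is reported as Warning.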
private static Status EquipmentAutomationFrameworkCellInstanceStatus(string cellInstanceName, Record record) {
Status result;
bool stop = false;
string state = string.Empty;
string stopTime = string.Empty;
string startTime = string.Empty;
string startable = string.Empty;
string currentHost = string.Empty;
string errorDescription = string.Empty;
string isReadyForRestart = string.Empty;
string communicationState = string.Empty;
string currentActiveVersion = string.Empty;
for (int i = 0; i < record.Segments.Length - 3; i++) {
if (stop) {
break;
}
if (string.IsNullOrEmpty(state) && record.Segments[i].StartsWith("State")) {
state = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(startable) && record.Segments[i].Contains("Startable")) {
startable = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(stopTime) && record.Segments[i].StartsWith("StopTime")) {
stopTime = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(currentHost) && record.Segments[i].Contains("CurrentHost")) {
currentHost = $"{record.Segments[i]} {record.Segments[i + 1]} {record.Segments[i + 2]}";
} else if (string.IsNullOrEmpty(errorDescription) && record.Segments[i].StartsWith("ErrorDescription")) {
errorDescription = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(communicationState) && record.Segments[i].StartsWith("CommunicationState")) {
communicationState = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(isReadyForRestart) && record.Segments[i].StartsWith("IsReadyForRestart")) {
isReadyForRestart = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(currentActiveVersion) && record.Segments[i].Contains("CurrentActiveVersion")) {
currentActiveVersion = record.Segments[i + 1];
} else if (string.IsNullOrEmpty(startTime) && record.Segments[i].Contains("StartTime")) {
startTime = $"{record.Segments[i + 1]} {record.Segments[i + 2]} {record.Segments[i + 3]}".Split('\t')[0];
}
}
if (errorDescription != "a") {
string[] segments = record.Text.Split(new string[] { "ErrorDescription" }, StringSplitOptions.RemoveEmptyEntries);
if (segments.Length > 1) {
segments = segments[1].Split(new string[] { "Info" }, StringSplitOptions.RemoveEmptyEntries);
errorDescription = segments[0].Trim();
}
}
Dictionary<string, string> nPorts = GetnPorts();
string nPort = nPorts.TryGetValue(cellInstanceName, out string? value) ? value : string.Empty;
if (state.EndsWith("a")) {
state = state[0..^1];
}
if (state == "Running" && communicationState == "Not") {
state = "Warning";
}
result = new(Host: record.Host,
Port: record.Port,
Text: record.Text,
NPort: nPort,
State: state,
CellInstanceName: cellInstanceName,
StopTime: stopTime,
StartTime: startTime,
Startable: startable,
CurrentHost: currentHost,
ErrorDescription: errorDescription,
IsReadyForRestart: isReadyForRestart,
CommunicationState: communicationState,
CurrentActiveVersion: currentActiveVersion);
return result;
}
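// Static lookup from cell instance name to an IP address (presumably the cell's nPort serial device server); callers fall back to an empty string for unknown names.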
private static Dictionary<string, string> GetnPorts() {
Dictionary<string, string> results = new() {
{ "TENCOR1", "10.95.192.31" },
{ "TENCOR2", "10.95.192.32" },
{ "TENCOR3", "10.95.192.33" },
{ "HGCV1", "10.95.192.34" },
{ "HGCV2", "10.95.154.17" },
{ "HGCV3", "10.95.192.36" },
{ "BIORAD2", "10.95.192.37" },
{ "BIORAD3", "10.95.192.38" },
{ "CDE2", "10.95.192.39" },
{ "CDE3", "10.95.154.19" },
{ "CDE5", "10.95.192.40" },
{ "SPARE-1", "10.95.192.47" },
{ "SPARE-2", "10.95.192.48" },
{ "SPARE-3", "10.95.192.49" },
{ "SPARE-4", "10.95.192.50" },
{ "SPARE-5", "10.95.192.51" },
};
return results;
}
private static void EquipmentAutomationFrameworkCellInstanceRestart(string host, int port, string cellName = "R71-HSMS", string verbBy = @"EC\EcMesEaf") {
EquipmentAutomationFrameworkCellInstanceParseCheck();
// Restart
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequence</a:Action><a:MessageID>urn:uuid:09fd9303-dcfe-4563-803b-678441b12949</a:MessageID><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><CreateSequence xmlns="http://schemas.xmlsoap.org/ws/2005/02/rm"><AcksTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></AcksTo><Offer><Identifier>urn:uuid:4f2050da-4287-411b-992f-3126a5b3b35b</Identifier></Offer></CreateSequence></s:Body></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:Sequence s:mustUnderstand="1"><r:Identifier>urn:uuid:fbf34c20-f381-4e82-b81f-b4c1809f14fa</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><a:Action s:mustUnderstand="1">http://tempuri.org/ICellControllerManager/RestartAllCellInstances</a:Action><a:MessageID>urn:uuid:c9f80db4-a2c2-4e53-9bed-8ba3a60b653c</a:MessageID><a:ReplyTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></a:ReplyTo><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><RestartAllCellInstances xmlns="http://tempuri.org/"><cellInstances xmlns:b="http://schemas.microsoft.com/2003/10/Serialization/Arrays" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><b:string>SP101_FileArchiver</b:string></cellInstances><updateInfo>Restarted by EC\ecphares</updateInfo></RestartAllCellInstances></s:Body></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:4f2050da-4287-411b-992f-3126a5b3b35b</r:Identifier><r:AcknowledgementRange Lower="1" Upper="1"/></r:SequenceAcknowledgement><r:Sequence s:mustUnderstand="1"><r:Identifier>urn:uuid:fbf34c20-f381-4e82-b81f-b4c1809f14fa</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body/></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:4f2050da-4287-411b-992f-3126a5b3b35b</r:Identifier><r:AcknowledgementRange Lower="1" Upper="2"/></r:SequenceAcknowledgement><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action><a:MessageID>urn:uuid:3b063df5-e6df-47a5-b530-aa380a4c6a38</a:MessageID><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:fbf34c20-f381-4e82-b81f-b4c1809f14fa</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>
EquipmentAutomationFrameworkCellInstanceVerb(host, port, cellName, verbBy, restart: true, stop: false, start: false);
}
private static void EquipmentAutomationFrameworkCellInstanceStart(string host, int port, string cellName = "R71-HSMS", string verbBy = @"EC\EcMesEaf") {
EquipmentAutomationFrameworkCellInstanceParseCheck();
// Start
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequence</a:Action><a:MessageID>urn:uuid:a1188d61-df04-4955-b1e4-b90faff95d4d</a:MessageID><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><CreateSequence xmlns="http://schemas.xmlsoap.org/ws/2005/02/rm"><AcksTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></AcksTo><Offer><Identifier>urn:uuid:35310d6d-3d17-419c-9b4f-cf20b705e5c9</Identifier></Offer></CreateSequence></s:Body></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:Sequence s:mustUnderstand="1"><r:Identifier>urn:uuid:739e01d3-5573-48a4-8bbb-53e2dfc344af</r:Identifier><r:MessageNumber>1</r:MessageNumber></r:Sequence><a:Action s:mustUnderstand="1">http://tempuri.org/ICellControllerManager/StartAllCellInstances</a:Action><a:MessageID>urn:uuid:8758eec2-b4dc-4338-ba3d-235aa15e634c</a:MessageID><a:ReplyTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></a:ReplyTo><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><StartAllCellInstances xmlns="http://tempuri.org/"><cellInstances xmlns:b="http://schemas.microsoft.com/2003/10/Serialization/Arrays" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><b:string>SP101_FileArchiver</b:string></cellInstances><updateInfo>Started by EC\ecphares</updateInfo></StartAllCellInstances></s:Body></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:35310d6d-3d17-419c-9b4f-cf20b705e5c9</r:Identifier><r:AcknowledgementRange Lower="1" Upper="1"/></r:SequenceAcknowledgement><r:Sequence s:mustUnderstand="1"><r:Identifier>urn:uuid:739e01d3-5573-48a4-8bbb-53e2dfc344af</r:Identifier><r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body/></s:Envelope>
// <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:r="http://schemas.xmlsoap.org/ws/2005/02/rm" xmlns:a="http://www.w3.org/2005/08/addressing"><s:Header><r:SequenceAcknowledgement><r:Identifier>urn:uuid:35310d6d-3d17-419c-9b4f-cf20b705e5c9</r:Identifier><r:AcknowledgementRange Lower="1" Upper="2"/></r:SequenceAcknowledgement><a:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action><a:MessageID>urn:uuid:b2bb5329-c846-4cd1-98a8-343136923702</a:MessageID><a:To s:mustUnderstand="1">http://eaf-prod.mes.infineon.com:9003/CellControllerManager</a:To></s:Header><s:Body><r:TerminateSequence><r:Identifier>urn:uuid:739e01d3-5573-48a4-8bbb-53e2dfc344af</r:Identifier></r:TerminateSequence></s:Body></s:Envelope>
EquipmentAutomationFrameworkCellInstanceVerb(host, port, cellName, verbBy, restart: false, stop: false, start: true);
}
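// Drives the four-message WS-ReliableMessaging exchange against the CellControllerManager endpoint: CreateSequence, the selected verb as message 1, LastMessage as message 2, then TerminateSequence.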
private static void EquipmentAutomationFrameworkCellInstanceVerb(string host, int port, string cellName, string verbBy, bool restart, bool stop, bool start) {
#pragma warning disable SYSLIB0014
WebClient webClient = new();
#pragma warning restore SYSLIB0014
string xml;
string verb;
Envelope? envelope;
string updateInfoVerb;
StringBuilder stringBuilder = new();
string cellControllerManager = $"http://{host}:{port}/CellControllerManager";
if (!restart && !stop && !start) {
throw new Exception();
} else if (restart && stop && start) {
throw new Exception();
} else if (restart) {
updateInfoVerb = "Restarted";
verb = "RestartAllCellInstances";
} else if (stop) {
updateInfoVerb = "Stopped";
verb = "StopAllCellInstancesResponse";
} else if (start) {
updateInfoVerb = "Started";
verb = "StartAllCellInstances";
} else
throw new Exception();
string headerMessageID621 = Guid.NewGuid().ToString();
string bodyIdentifier621 = Guid.NewGuid().ToString();
_ = stringBuilder.Append("<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:a=\"http://www.w3.org/2005/08/addressing\">").
Append("<s:Header><a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/CreateSequence</a:Action>").
Append("<a:MessageID>urn:uuid:").Append(headerMessageID621).Append("</a:MessageID>").
Append("<a:To s:mustUnderstand=\"1\">").Append(cellControllerManager).Append("</a:To>").
Append("</s:Header><s:Body><CreateSequence xmlns=\"http://schemas.xmlsoap.org/ws/2005/02/rm\"><AcksTo>").
Append("<a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></AcksTo><Offer>").
Append("<Identifier>urn:uuid:").Append(bodyIdentifier621).Append("</Identifier>").
Append("</Offer></CreateSequence></s:Body></s:Envelope>");
xml = stringBuilder.ToString();
_ = stringBuilder.Clear();
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+xml; charset=utf-8");
xml = webClient.UploadString(cellControllerManager, xml);
envelope = ParseXML<Envelope>(xml, throwExceptions: true);
if (envelope is null || envelope.Body is null || envelope.Body.CreateSequenceResponse is null || string.IsNullOrEmpty(envelope.Body.CreateSequenceResponse.Identifier)) {
throw new Exception("Invalid reply! Example [urn:uuid:f1aa1fa8-9099-48b6-b27f-50e6c098605b]");
}
string headerSequenceIdentifier895 = envelope.Body.CreateSequenceResponse.Identifier["urn:uuid:".Length..];
string headerMessageID895 = Guid.NewGuid().ToString();
_ = stringBuilder.Append("<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\">").
Append("<s:Header><r:Sequence s:mustUnderstand=\"1\">").
Append("<r:Identifier>urn:uuid:").Append(headerSequenceIdentifier895).Append("</r:Identifier>").
Append("<r:MessageNumber>1</r:MessageNumber></r:Sequence>").
Append("<a:Action s:mustUnderstand=\"1\">http://tempuri.org/ICellControllerManager/").Append(verb).Append("</a:Action>").
Append("<a:MessageID>urn:uuid:").Append(headerMessageID895).Append("</a:MessageID>").
Append("<a:ReplyTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></a:ReplyTo>").
Append("<a:To s:mustUnderstand=\"1\">").Append(cellControllerManager).Append("</a:To>").
Append("</s:Header><s:Body>").
Append('<').Append(verb).Append(" xmlns=\"http://tempuri.org/\">").
Append("<cellInstances xmlns:b=\"http://schemas.microsoft.com/2003/10/Serialization/Arrays\" xmlns:i=\"http://www.w3.org/2001/XMLSchema-instance\">").
Append("<b:string>").Append(cellName).Append("</b:string>").
Append("</cellInstances>").
Append("<updateInfo>").Append(updateInfoVerb).Append(" by ").Append(verbBy).Append("</updateInfo>").
Append("</").Append(verb).Append('>').
Append("</s:Body></s:Envelope>");
xml = stringBuilder.ToString();
_ = stringBuilder.Clear();
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+xml; charset=utf-8");
xml = webClient.UploadString(cellControllerManager, xml);
_ = ParseXML<Envelope>(xml, throwExceptions: true);
string headerSequenceAcknowledgementIdentifier748 = bodyIdentifier621;
string headerSequenceIdentifier748 = headerSequenceIdentifier895;
_ = stringBuilder.Append("<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\">").
Append("<s:Header><r:SequenceAcknowledgement>").
Append("<r:Identifier>urn:uuid:").Append(headerSequenceAcknowledgementIdentifier748).Append("</r:Identifier>").
Append("<r:AcknowledgementRange Lower=\"1\" Upper=\"1\"/></r:SequenceAcknowledgement><r:Sequence s:mustUnderstand=\"1\">").
Append("<r:Identifier>urn:uuid:").Append(headerSequenceIdentifier748).Append("</r:Identifier>").
Append("<r:MessageNumber>2</r:MessageNumber><r:LastMessage/></r:Sequence>").
Append("<a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/LastMessage</a:Action>").
Append("<a:To s:mustUnderstand=\"1\">").Append(cellControllerManager).Append("</a:To>").
Append("</s:Header><s:Body/></s:Envelope>");
xml = stringBuilder.ToString();
_ = stringBuilder.Clear();
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+xml; charset=utf-8");
xml = webClient.UploadString(cellControllerManager, xml);
_ = ParseXML<Envelope>(xml, throwExceptions: true);
string headerSequenceAcknowledgementIdentifier707 = bodyIdentifier621;
string headerMessageID707 = Guid.NewGuid().ToString();
string bodyTerminateSequenceIdentifier707 = headerSequenceIdentifier895;
_ = stringBuilder.Append("<s:Envelope xmlns:s=\"http://www.w3.org/2003/05/soap-envelope\" xmlns:r=\"http://schemas.xmlsoap.org/ws/2005/02/rm\" xmlns:a=\"http://www.w3.org/2005/08/addressing\">").
Append("<s:Header><r:SequenceAcknowledgement>").
Append("<r:Identifier>urn:uuid:").Append(headerSequenceAcknowledgementIdentifier707).Append("</r:Identifier>").
Append("<r:AcknowledgementRange Lower=\"1\" Upper=\"2\"/></r:SequenceAcknowledgement>").
Append("<a:Action s:mustUnderstand=\"1\">http://schemas.xmlsoap.org/ws/2005/02/rm/TerminateSequence</a:Action>").
Append("<a:MessageID>urn:uuid:").Append(headerMessageID707).Append("</a:MessageID>").
Append("<a:To s:mustUnderstand=\"1\">").Append(cellControllerManager).Append("</a:To>").
Append("</s:Header><s:Body>").
Append("<r:TerminateSequence>").
Append("<r:Identifier>urn:uuid:").Append(bodyTerminateSequenceIdentifier707).Append("</r:Identifier>").
Append("</r:TerminateSequence>").
Append("</s:Body></s:Envelope>");
xml = stringBuilder.ToString();
_ = stringBuilder.Clear();
webClient.Headers.Clear();
webClient.Headers.Add("Accept-Encoding: gzip, deflate");
webClient.Headers.Add("Content-Type: application/soap+xml; charset=utf-8");
xml = webClient.UploadString(cellControllerManager, xml);
_ = ParseXML<Envelope>(xml, throwExceptions: true);
}
}


@ -0,0 +1,89 @@
using System.Collections.ObjectModel;
using Microsoft.Extensions.Logging;
namespace File_Folder_Helper.ADO2025.PI6;
internal static partial class Helper20250618 {
private record Record(string Directory, List<string> Files);
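// args[0]: source directory (before '~'), args[2]: search pattern, args[3]: destination directory (before '~'), args[4]: split token (before '~'), args[5]: number of files to keep per group.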
internal static void MoveAllButXOfEach(ILogger<Worker> logger, List<string> args) {
int keep = int.Parse(args[5]);
logger.LogInformation(args[0]);
logger.LogInformation(args[1]);
logger.LogInformation(args[2]);
string searchPattern = args[2];
string split = args[4].Split('~')[0];
string sourceDirectory = Path.GetFullPath(args[0].Split('~')[0]);
string destinationDirectory = Path.GetFullPath(args[3].Split('~')[0]);
if (destinationDirectory.Contains(sourceDirectory)) {
throw new Exception("Not allowed!");
}
ReadOnlyCollection<string> directories = GetDirectories(sourceDirectory);
MoveAllButXOfEachB(logger, searchPattern, sourceDirectory, destinationDirectory, split, keep, directories);
}
private static ReadOnlyCollection<string> GetDirectories(string sourceDirectory) {
List<string> results = [sourceDirectory];
results.AddRange(Directory.GetDirectories(sourceDirectory, "*", SearchOption.AllDirectories));
return results.AsReadOnly();
}
private static void MoveAllButXOfEachB(ILogger<Worker> logger, string searchPattern, string sourceDirectory, string destinationDirectory, string split, int keep, ReadOnlyCollection<string> directories) {
string[] files;
int sourceDirectoryLength = sourceDirectory.Length;
ReadOnlyDictionary<string, ReadOnlyCollection<string>> keyValuePairs;
foreach (string directory in directories) {
files = Directory.GetFiles(directory, searchPattern, SearchOption.TopDirectoryOnly);
keyValuePairs = GetFiles(split, files);
foreach (KeyValuePair<string, ReadOnlyCollection<string>> keyValuePair in keyValuePairs) {
if (keyValuePair.Value.Count <= keep) {
continue;
} else {
MoveAllButXOfEachC(logger, keep, sourceDirectoryLength, destinationDirectory, keyValuePair);
}
}
}
}
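// Groups files by the file-name prefix before the split token; each group is ordered descending so the latest name sorts first.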
private static ReadOnlyDictionary<string, ReadOnlyCollection<string>> GetFiles(string split, string[] files) {
Dictionary<string, ReadOnlyCollection<string>> results = [];
string key;
List<string>? collection;
Dictionary<string, List<string>> keyValuePairs = [];
foreach (string file in files) {
key = Path.GetFileName(file).Split(split)[0];
if (!keyValuePairs.TryGetValue(key, out collection)) {
collection = [];
keyValuePairs.Add(key, collection);
}
collection.Add(file);
}
foreach (KeyValuePair<string, List<string>> keyValuePair in keyValuePairs) {
results.Add(keyValuePair.Key, keyValuePair.Value.OrderByDescending(l => l).ToArray().AsReadOnly());
}
return results.AsReadOnly();
}
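// Moves a group's surplus files to the mirrored path under destinationDirectory, creating target directories as needed and skipping files that already exist.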
private static void MoveAllButXOfEachC(ILogger<Worker> logger, int keep, int sourceDirectoryLength, string destinationDirectory, KeyValuePair<string, ReadOnlyCollection<string>> keyValuePair) {
string file;
string checkFile;
string checkDirectory;
for (int i = keep; i < keyValuePair.Value.Count; i++) {
file = keyValuePair.Value[i];
checkFile = $"{destinationDirectory}{file[sourceDirectoryLength..]}";
checkDirectory = Path.GetDirectoryName(checkFile) ?? throw new Exception();
if (!Directory.Exists(checkDirectory))
_ = Directory.CreateDirectory(checkDirectory);
if (File.Exists(checkFile))
continue;
File.Move(file, checkFile);
logger.LogInformation("<{file}> moved", file);
}
}
}


@ -1,206 +0,0 @@
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
namespace File_Folder_Helper.Day;
internal static partial class Helper20240623
{
private record SubTaskLine(string Text, bool Done, long? Ticks, int? Line);
private record Record(int? CodeInsidersLine, string File, string[] Lines, int? StopLine, int? SubTasksLine);
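// Scans each markdown file for the code-insiders command line (immediately followed by a ``` fence line) and for the sub-tasks heading, recording where the sub-task section stops (next heading or end of file).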
private static List<Record> GetRecords(string sourceDirectory, string searchPattern, string codeInsiders, string subTasks)
{
List<Record> results = [];
int? stopLine;
string[] lines;
int? subTasksLine;
int? codeInsidersLine;
string[] files = Directory.GetFiles(sourceDirectory, searchPattern, SearchOption.AllDirectories);
foreach (string file in files)
{
stopLine = null;
subTasksLine = null;
codeInsidersLine = null;
lines = File.ReadAllLines(file);
for (int i = 0; i < lines.Length; i++)
{
if (lines[i].StartsWith(codeInsiders) && lines[i][^1] == '"')
{
if (lines.Length > i + 1 && lines[i + 1] == "```")
codeInsidersLine = i;
}
if (lines[i] != subTasks)
continue;
subTasksLine = i;
if (codeInsidersLine is null)
break;
if (lines.Length > i)
{
for (int j = i + 1; j < lines.Length; j++)
{
if (lines[j].Length > 0 && lines[j][0] == '#')
{
stopLine = j;
break;
}
}
}
stopLine ??= lines.Length;
break;
}
results.Add(new(codeInsidersLine, file, lines, stopLine, subTasksLine));
}
return results;
}
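// Collects the checkbox lines under the sub-tasks heading of the linked file, then appends a summary line built from the file's H1 (or the fallback line when no H1 exists), stamped with the file's last-write ticks.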
private static ReadOnlyCollection<SubTaskLine> GetSubTasks(string subTasks, string[] tasks, bool? foundDone, string fallbackLine, FileInfo fileInfo)
{
List<SubTaskLine> results = [];
string line;
bool doneValue;
string? h1 = null;
bool foundSubTasks = false;
int tasksZeroLength = tasks[0].Length;
string[] lines = File.ReadAllLines(fileInfo.FullName);
for (int i = 0; i < lines.Length; i++)
{
line = lines[i];
if (line.StartsWith("# "))
h1 = line[2..];
if (!foundSubTasks && line == subTasks)
foundSubTasks = true;
if (!foundSubTasks)
continue;
if (line.Length <= tasksZeroLength || !line.StartsWith(tasks[0]) || line[tasksZeroLength] is not ' ' and not 'x' || line[tasksZeroLength + 1] != ']')
continue;
doneValue = foundDone is not null && foundDone.Value;
results.Add(new($" {line}", doneValue, fileInfo.LastWriteTime.Ticks, i));
}
doneValue = foundDone is not null && foundDone.Value;
if (h1 is null)
results.Add(new(fallbackLine, doneValue, fileInfo.LastWriteTime.Ticks, Line: null));
else
results.Add(new(foundDone is null || !foundDone.Value ? $"- [ ] {fileInfo.LastWriteTime.Ticks} ~~~ {h1}" : $"- [x] {fileInfo.LastWriteTime.Ticks} ~~~ {h1}", doneValue, fileInfo.LastWriteTime.Ticks, Line: 0));
return new(results);
}
internal static void UpdateSubTasksInMarkdownFiles(ILogger<Worker> logger, List<string> args)
{
int lineCheck;
bool doneValue;
bool? foundDone;
FileInfo fileInfo;
string[] newLines;
string[] segments;
List<string> lines;
string fallbackLine;
string[] indexLines;
string checkDirectory;
string done = args[7];
List<string> indexFiles;
string subTasks = args[3];
List<string> oldLines = [];
string indexFile = args[5];
string searchPattern = args[2];
string directoryFilter = args[8];
string[] tasks = args[6].Split(',');
string codeInsiders = $"{args[4]} \"";
List<SubTaskLine> allSubTaskLines = [];
ReadOnlyCollection<SubTaskLine> subTaskLines;
string sourceDirectory = Path.GetFullPath(args[0]);
List<Record> records = GetRecords(sourceDirectory, searchPattern, codeInsiders, subTasks);
foreach (Record record in from l in records orderby l.SubTasksLine is null, l.CodeInsidersLine is null select l)
{
if (record.SubTasksLine is null)
continue;
if (record.CodeInsidersLine is not null)
logger.LogInformation("<{file}> has [{subTasks}]", Path.GetFileNameWithoutExtension(record.File), subTasks);
else
{
logger.LogWarning("<{file}> has [{subTasks}] but doesn't have [{codeInsiders}]!", Path.GetFileNameWithoutExtension(record.File), subTasks, codeInsiders);
continue;
}
if (record.StopLine is null)
continue;
checkDirectory = record.Lines[record.CodeInsidersLine.Value][codeInsiders.Length..^1];
if (!Directory.Exists(checkDirectory))
{
logger.LogError("<{checkDirectory}> doesn't exist", Path.GetFileName(checkDirectory));
continue;
}
indexFiles = Directory.GetFiles(checkDirectory, indexFile, SearchOption.AllDirectories).ToList();
if (indexFiles.Count != 1)
{
for (int i = indexFiles.Count - 1; i > -1; i--)
{
if (!indexFiles[i].Contains(directoryFilter, StringComparison.CurrentCultureIgnoreCase))
indexFiles.RemoveAt(i);
}
if (indexFiles.Count != 1)
{
logger.LogError("<{checkDirectory}> doesn't have a [{indexFile}]", Path.GetFileName(checkDirectory), indexFile);
continue;
}
}
foundDone = null;
oldLines.Clear();
allSubTaskLines.Clear();
indexLines = File.ReadAllLines(indexFiles[0]);
checkDirectory = Path.GetDirectoryName(indexFiles[0]) ?? throw new Exception();
for (int i = 0; i < indexLines.Length; i++)
{
if (indexLines[i] == done)
foundDone = true;
segments = indexLines[i].Split(tasks[1]);
doneValue = foundDone is not null && foundDone.Value;
if (segments.Length > 2 || !segments[0].StartsWith(tasks[0]))
continue;
fallbackLine = foundDone is null || !foundDone.Value ? $"- [ ] {segments[0][tasks[0].Length..]}" : $"- [x] {segments[0][tasks[0].Length..]}";
fileInfo = new(Path.GetFullPath(Path.Combine(checkDirectory, segments[1][..^1])));
if (!fileInfo.Exists)
{
allSubTaskLines.Add(new(fallbackLine, doneValue, Ticks: null, Line: null));
continue;
}
subTaskLines = GetSubTasks(subTasks, tasks, doneValue, fallbackLine, fileInfo);
for (int j = subTaskLines.Count - 1; j >= 0; j--)
allSubTaskLines.Add(subTaskLines[j]);
}
if (allSubTaskLines.Count == 0)
continue;
lineCheck = 0;
for (int i = record.SubTasksLine.Value + 1; i < record.StopLine.Value - 1; i++)
oldLines.Add(record.Lines[i]);
if (allSubTaskLines.Any(l => l.Ticks is null))
newLines = (from l in allSubTaskLines select l.Text).ToArray();
else
newLines = (from l in allSubTaskLines orderby l.Done descending, l.Ticks, l.Line select l.Text).ToArray();
if (newLines.Length == oldLines.Count)
{
for (int i = 0; i < newLines.Length; i++)
{
if (newLines[i] != record.Lines[record.SubTasksLine.Value + 1 + i])
continue;
lineCheck++;
}
if (lineCheck == newLines.Length)
continue;
}
checkDirectory = Path.Combine(checkDirectory, DateTime.Now.Ticks.ToString());
_ = Directory.CreateDirectory(checkDirectory);
Thread.Sleep(500);
Directory.Delete(checkDirectory);
lines = record.Lines.ToList();
for (int i = record.StopLine.Value - 1; i > record.SubTasksLine.Value + 1; i--)
lines.RemoveAt(i);
if (record.StopLine.Value == record.Lines.Length && lines[^1].Length == 0)
lines.RemoveAt(lines.Count - 1);
for (int i = 0; i < newLines.Length; i++)
lines.Insert(record.SubTasksLine.Value + 1 + i, newLines[i]);
lines.Insert(record.SubTasksLine.Value + 1, string.Empty);
File.WriteAllLines(record.File, lines);
}
}
}


@ -1,206 +0,0 @@
using File_Folder_Helper.Models;
using Microsoft.Extensions.Logging;
using System.Collections.ObjectModel;
using System.Diagnostics;
using System.Globalization;
using System.Text.Json;
namespace File_Folder_Helper.Day;
internal static partial class Helper20240724
{
private record FileConnectorConfigurationSystem(string AlternateTargetFolder,
string FileAgeThreshold,
string[] SourceFileFilters,
string TargetFileLocation);
#pragma warning disable IDE0028, IDE0056, IDE0300, IDE0240, IDE0241
private static readonly HttpClient _HttpClient = new();
private static readonly string _StaticFileServer = "localhost:5054";
private static readonly FileConnectorConfigurationSystem _FileConnectorConfiguration = new(
"D:/Tmp/Phares/AlternateTargetFolder",
"000:20:00:01",
[".txt"],
"D:/Tmp/Phares/TargetFileLocation");
private static DateTime GetFileAgeThresholdDateTime(string fileAgeThreshold)
{
DateTime result = DateTime.Now;
string[] segments = fileAgeThreshold.Split(':');
for (int i = 0; i < segments.Length; i++)
{
result = i switch
{
0 => result.AddDays(double.Parse(segments[i]) * -1),
1 => result.AddHours(double.Parse(segments[i]) * -1),
2 => result.AddMinutes(double.Parse(segments[i]) * -1),
3 => result.AddSeconds(double.Parse(segments[i]) * -1),
_ => throw new Exception(),
};
}
return result;
}
private static string[] GetValidWeeks(DateTime fileAgeThresholdDateTime)
{
DateTime dateTime = DateTime.Now;
Calendar calendar = new CultureInfo("en-US").Calendar;
string weekOfYear = $"{dateTime:yyyy}_Week_{calendar.GetWeekOfYear(dateTime, CalendarWeekRule.FirstDay, DayOfWeek.Sunday):00}";
string lastWeekOfYear = $"{fileAgeThresholdDateTime:yyyy}_Week_{calendar.GetWeekOfYear(fileAgeThresholdDateTime, CalendarWeekRule.FirstDay, DayOfWeek.Sunday):00}";
return new string[] { weekOfYear, lastWeekOfYear }.Distinct().ToArray();
}
private static string[] GetValidDays(DateTime fileAgeThresholdDateTime)
{
DateTime dateTime = DateTime.Now;
return new string[] { dateTime.ToString("yyyy-MM-dd"), fileAgeThresholdDateTime.ToString("yyyy-MM-dd") }.Distinct().ToArray();
}
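// Keeps only the listings modified on or after the threshold and rewrites each entry with its local target path and its full source URL.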
private static ReadOnlyCollection<NginxFileSystem> GetDayNginxFileSystemCollection(DateTime fileAgeThresholdDateTime, string week, string day, string dayUrl, NginxFileSystem[] dayNginxFileSystemCollection)
{
List<NginxFileSystem> results = new();
DateTime dateTime;
string nginxFormat = "ddd, dd MMM yyyy HH:mm:ss zzz";
foreach (NginxFileSystem dayNginxFileSystem in dayNginxFileSystemCollection)
{
if (!DateTime.TryParseExact(dayNginxFileSystem.MTime.Replace("GMT", "+00:00"), nginxFormat, CultureInfo.InvariantCulture, DateTimeStyles.None, out dateTime))
continue;
if (dateTime < fileAgeThresholdDateTime)
continue;
results.Add(new(
Path.GetFullPath(Path.Combine(_FileConnectorConfiguration.TargetFileLocation, week, day, dayNginxFileSystem.Name)),
string.Concat(dayUrl, '/', dayNginxFileSystem.Name),
dateTime.ToString(),
dayNginxFileSystem.Size));
}
return results.AsReadOnly();
}
private static ReadOnlyCollection<NginxFileSystem> GetDayNginxFileSystemCollection(DateTime fileAgeThresholdDateTime)
{
#nullable enable
List<NginxFileSystem> results = new();
string dayUrl;
string dayJson;
string weekJson;
string checkWeek;
Task<HttpResponseMessage> task;
NginxFileSystem[]? dayNginxFileSystemCollection;
NginxFileSystem[]? weekNginxFileSystemCollection;
string[] days = GetValidDays(fileAgeThresholdDateTime);
string[] weeks = GetValidWeeks(fileAgeThresholdDateTime);
foreach (string week in weeks)
{
checkWeek = string.Concat("http://", _StaticFileServer, '/', week);
task = _HttpClient.GetAsync(checkWeek);
task.Wait();
if (!task.Result.IsSuccessStatusCode)
continue;
weekJson = _HttpClient.GetStringAsync(checkWeek).Result;
weekNginxFileSystemCollection = JsonSerializer.Deserialize(weekJson, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (weekNginxFileSystemCollection is null)
continue;
foreach (NginxFileSystem weekNginxFileSystem in weekNginxFileSystemCollection)
{
if (!(from l in days where weekNginxFileSystem.Name == l select false).Any())
continue;
dayUrl = string.Concat(checkWeek, '/', weekNginxFileSystem.Name);
dayJson = _HttpClient.GetStringAsync(dayUrl).Result;
dayNginxFileSystemCollection = JsonSerializer.Deserialize(dayJson, NginxFileSystemCollectionSourceGenerationContext.Default.NginxFileSystemArray);
if (dayNginxFileSystemCollection is null)
continue;
results.AddRange(GetDayNginxFileSystemCollection(fileAgeThresholdDateTime, week, weekNginxFileSystem.Name, dayUrl, dayNginxFileSystemCollection));
}
}
return results.AsReadOnly();
#nullable disable
}
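// Builds the list of candidate downloads: entries newer than the age threshold whose local copy is missing or carries a different last-write time, ordered oldest first.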
private static ReadOnlyCollection<Tuple<DateTime, FileInfo, FileInfo, string>> GetPossible()
{
List<Tuple<DateTime, FileInfo, FileInfo, string>> results = new();
DateTime dateTime;
FileInfo targetFileInfo;
FileInfo alternateFileInfo;
DateTime fileAgeThresholdDateTime = GetFileAgeThresholdDateTime(_FileConnectorConfiguration.FileAgeThreshold);
ReadOnlyCollection<NginxFileSystem> dayNginxFileSystemCollection = GetDayNginxFileSystemCollection(fileAgeThresholdDateTime);
foreach (NginxFileSystem nginxFileSystem in dayNginxFileSystemCollection)
{
targetFileInfo = new FileInfo(nginxFileSystem.Name);
if (targetFileInfo.Directory is null)
continue;
if (!Directory.Exists(targetFileInfo.Directory.FullName))
_ = Directory.CreateDirectory(targetFileInfo.Directory.FullName);
if (!DateTime.TryParse(nginxFileSystem.MTime, out dateTime))
continue;
if (targetFileInfo.Exists && targetFileInfo.LastWriteTime == dateTime)
continue;
alternateFileInfo = new(Path.Combine(_FileConnectorConfiguration.AlternateTargetFolder, nginxFileSystem.Name));
results.Add(new(dateTime, targetFileInfo, alternateFileInfo, nginxFileSystem.Type));
}
return (from l in results orderby l.Item1 select l).ToList().AsReadOnly();
}
private static void Test()
{
#nullable enable
if (_HttpClient is null)
throw new Exception();
if (string.IsNullOrEmpty(_StaticFileServer))
throw new Exception();
if (string.IsNullOrEmpty(_StaticFileServer))
{
ReadOnlyCollection<Tuple<DateTime, FileInfo, FileInfo, string>> possibleDownload = GetPossible();
if (possibleDownload.Count > 0)
{
string targetFileName = possibleDownload[0].Item4;
FileInfo targetFileInfo = possibleDownload[0].Item2;
FileInfo alternateFileInfo = possibleDownload[0].Item3;
DateTime matchNginxFileSystemDateTime = possibleDownload[0].Item1;
// if (alternateFileInfo.Exists)
// File.Delete(alternateFileInfo.FullName);
if (targetFileInfo.Exists)
File.Delete(targetFileInfo.FullName);
string targetJson = _HttpClient.GetStringAsync(targetFileName).Result;
File.WriteAllText(targetFileInfo.FullName, targetJson);
targetFileInfo.LastWriteTime = matchNginxFileSystemDateTime;
// File.Copy(targetFileInfo.FullName, alternateFileInfo.FullName);
File.AppendAllText(alternateFileInfo.FullName, targetJson);
}
}
#nullable disable
}
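// Mirrors every directory whose path contains the filter token to the same path with the token replaced, via xCopy; an existing empty target is deleted first so the copy can recreate it.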
internal static void CopyDirectories(ILogger<Worker> logger, List<string> args)
{
Test();
string[] files;
Process process;
string checkDirectory;
string filter = args[3];
string replaceWith = args[4];
string searchPattern = args[2];
string sourceDirectory = Path.GetFullPath(args[0]);
string[] foundDirectories = Directory.GetDirectories(sourceDirectory, searchPattern, SearchOption.AllDirectories);
logger.LogInformation($"Found {foundDirectories.Length} directories");
foreach (string foundDirectory in foundDirectories)
{
if (!foundDirectory.Contains(filter))
continue;
logger.LogDebug(foundDirectory);
checkDirectory = foundDirectory.Replace(filter, replaceWith);
if (Directory.Exists(checkDirectory))
{
files = Directory.GetFiles(checkDirectory, "*", SearchOption.AllDirectories);
if (files.Length > 0)
continue;
Directory.Delete(checkDirectory);
}
process = Process.Start("cmd.exe", $"/c xCopy \"{foundDirectory}\" \"{checkDirectory}\" /S /E /I /H /Y");
process.WaitForExit();
}
}
}


@ -10,83 +10,167 @@ internal static class HelperDay
{
logger.LogInformation("X) Day Helpers,");
if (args[1] == "Day-Helper-2023-09-06")
Day.Helper20230906.SaveJson(logger, args[0]);
Day.Q32023.Helper20230906.SaveJson(logger, args[0]);
else if (args[1] == "Day-Helper-2023-10-10")
Day.Helper20231010.HgCV(logger, args[0]);
Day.Q42023.Helper20231010.HgCV(logger, args[0]);
else if (args[1] == "Day-Helper-2023-10-16")
Day.Helper20231016.MoveDirectory(logger, args[0]);
Day.Q42023.Helper20231016.MoveDirectory(logger, args[0]);
else if (args[1] == "Day-Helper-2023-10-24")
Day.Helper20231024.NetUse(logger, args[0]);
Day.Q42023.Helper20231024.NetUse(logger, args[0]);
else if (args[1] == "Day-Helper-2023-11-02")
Day.Helper20231102.NuSpec(logger, args[0]);
Day.Q42023.Helper20231102.NuSpec(logger, args[0]);
else if (args[1] == "Day-Helper-2023-11-08")
Day.Helper20231108.MasterImage(logger, args);
Day.Q42023.Helper20231108.MasterImage(logger, args);
else if (args[1] == "Day-Helper-2023-11-22")
Day.Helper20231122.ProcessDataStandardFormat(logger, args);
Day.Q42023.Helper20231122.ProcessDataStandardFormat(logger, args);
else if (args[1] == "Day-Helper-2023-11-28")
logger.LogError("{arg} - has been migrated to File-Watcher", args[1]);
else if (args[1] == "Day-Helper-2023-11-30")
Day.Helper20231130.RenameReactorProcessDataStandardFormatFiles(logger, args);
Day.Q42023.Helper20231130.RenameReactorProcessDataStandardFormatFiles(logger, args);
else if (args[1] == "Day-Helper-2023-12-05")
Day.Helper20231205.SplitMarkdownFile(logger, args);
Day.Q42023.Helper20231205.SplitMarkdownFile(logger, args);
else if (args[1] == "Day-Helper-2023-12-12")
logger.LogError("{arg} - was deleted on 2024-04-08", args[1]);
else if (args[1] == "Day-Helper-2023-12-22")
Day.Helper20231222.ConvertId(logger, args);
Day.Q42023.Helper20231222.ConvertId(logger, args);
else if (args[1] == "Day-Helper-2024-01-05")
Day.Helper20240105.ConvertKeePassExport(logger, args);
ADO2024.PI1.Helper20240105.ConvertKeePassExport(logger, args);
else if (args[1] == "Day-Helper-2024-01-06")
Day.Helper20240106.TextToJson(logger, args);
ADO2024.PI1.Helper20240106.TextToJson(logger, args);
else if (args[1] == "Day-Helper-2024-01-07")
Day.Helper20240107.DirectoryToISO(logger, args);
ADO2024.PI1.Helper20240107.DirectoryToISO(logger, args);
else if (args[1] == "Day-Helper-2024-01-08")
Day.Helper20240108.SortCodeMethods(logger, args, cancellationToken);
ADO2024.PI1.Helper20240108.SortCodeMethods(logger, args, cancellationToken);
else if (args[1] == "Day-Helper-2024-01-27")
logger.LogError("{arg} - has been migrated to Clipboard_Send_Keys", args[1]);
else if (args[1] == "Day-Helper-2024-01-29")
Day.Helper20240129.JsonToTsv(logger, args);
ADO2024.PI1.Helper20240129.JsonToTsv(logger, args);
else if (args[1] == "Day-Helper-2024-03-05")
Day.Helper20240305.ArchiveFiles(logger, args);
ADO2024.PI1.Helper20240305.ArchiveFiles(logger, args);
else if (args[1] == "Day-Helper-2024-04-03")
Day.Helper20240403.AlertIfNewDeviceIsConnected(logger, args);
ADO2024.PI1.Helper20240403.AlertIfNewDeviceIsConnected(logger, args);
else if (args[1] == "Day-Helper-2024-04-04")
Day.Helper20240404.ParseCSV(logger, args);
ADO2024.PI1.Helper20240404.ParseCSV(logger, args);
else if (args[1] == "Day-Helper-2024-04-09")
Day.Helper20240409.MonA(logger, args);
ADO2024.PI1.Helper20240409.MonA(logger, args);
else if (args[1] == "Day-Helper-2024-04-17")
Day.Helper20240417.FilteredRunCommand(logger, args, cancellationToken);
ADO2024.PI1.Helper20240417.FilteredRunCommand(logger, args, cancellationToken);
else if (args[1] == "Day-Helper-2024-04-26")
Day.Helper20240426.UpdateTests(logger, args);
ADO2024.PI1.Helper20240426.UpdateTests(logger, args);
else if (args[1] == "Day-Helper-2024-04-27")
Day.Helper20240427.Immich(appSettings, logger, args);
ADO2024.PI1.Helper20240427.Immich(appSettings, logger, args);
else if (args[1] == "Day-Helper-2024-04-29")
Day.Helper20240429.GitConfigCleanUp(logger, args);
ADO2024.PI2.Helper20240429.GitConfigCleanUp(logger, args);
else if (args[1] == "Day-Helper-2024-05-10")
Day.Helper20240510.PullIconsForBLM(logger, args);
ADO2024.PI2.Helper20240510.PullIconsForBLM(logger, args);
else if (args[1] == "Day-Helper-2024-05-13")
Day.Helper20240513.PersonKeyToName(logger, args);
ADO2024.PI2.Helper20240513.PersonKeyToName(logger, args);
else if (args[1] == "Day-Helper-2024-05-17")
Day.Helper20240517.SaveAmazon(logger, args);
ADO2024.PI2.Helper20240517.SaveAmazon(logger, args);
else if (args[1] == "Day-Helper-2024-05-18")
Day.Helper20240518.PersonKeyToImmichImport(logger, args);
ADO2024.PI2.Helper20240518.PersonKeyToImmichImport(logger, args);
else if (args[1] == "Day-Helper-2024-05-19")
Day.Helper20240519.FindReplaceDirectoryName(logger, args);
ADO2024.PI2.Helper20240519.FindReplaceDirectoryName(logger, args);
else if (args[1] == "Day-Helper-2024-05-20")
Day.Helper20240520.IdentifierRename(logger, args);
ADO2024.PI2.Helper20240520.IdentifierRename(logger, args);
else if (args[1] == "Day-Helper-2024-06-23")
Day.Helper20240623.UpdateSubTasksInMarkdownFiles(logger, args);
ADO2024.PI2.Helper20240623.UpdateSubTasksInMarkdownFiles(logger, args);
else if (args[1] == "Day-Helper-2024-06-24")
Day.Helper20240624.MoveUpOneDirectory(logger, args);
ADO2024.PI2.Helper20240624.MoveUpOneDirectory(logger, args);
else if (args[1] == "Day-Helper-2024-07-11")
Day.Helper20240711.GitRemoteRemove(logger, args);
ADO2024.PI2.Helper20240711.GitRemoteRemove(logger, args);
else if (args[1] == "Day-Helper-2024-07-18")
Day.Helper20240718.JsonToMarkdown(logger, args);
else if (args[1] == "Day-Helper-2024-07-24")
Day.Helper20240724.CopyDirectories(logger, args);
ADO2024.PI2.Helper20240718.JsonToMarkdown(logger, args);
else if (args[1] == "Day-Helper-2024-07-28")
Day.Helper20240728.DownloadSslCertificates(logger, args);
ADO2024.PI2.Helper20240728.DownloadSslCertificates(logger, args);
else if (args[1] == "Day-Helper-2024-08-05")
Day.Helper20240805.RenameFiles(logger, args);
ADO2024.PI3.Helper20240805.RenameFiles(logger, args);
else if (args[1] == "Day-Helper-2024-08-06")
ADO2024.PI3.Helper20240806.ArchiveFiles(logger, args);
else if (args[1] == "Day-Helper-2024-08-09")
ADO2024.PI3.Helper20240809.CreateWorkItems(logger, args);
else if (args[1] == "Day-Helper-2024-08-20")
ADO2024.PI3.Helper20240820.MoveFilesWithSleep(logger, args);
else if (args[1] == "Day-Helper-2024-08-22")
ADO2024.PI3.Helper20240822.ParseKanbn(logger, args);
else if (args[1] == "Day-Helper-2024-08-28")
ADO2024.PI3.Helper20240828.MoveWaferCounterToArchive(logger, args);
else if (args[1] == "Day-Helper-2024-08-30")
ADO2024.PI3.Helper20240830.CompareWorkItems(logger, args);
else if (args[1] == "Day-Helper-2024-09-10")
ADO2024.PI3.Helper20240910.MoveFilesToWeekOfYear(logger, args);
else if (args[1] == "Day-Helper-2024-09-11")
ADO2024.PI3.Helper20240911.WriteMarkdown(logger, args);
else if (args[1] == "Day-Helper-2024-09-16")
ADO2024.PI3.Helper20240916.DebugProxyPass(logger, args);
else if (args[1] == "Day-Helper-2024-09-25")
ADO2024.PI3.Helper20240925.DistinctTests(logger, args);
else if (args[1] == "Day-Helper-2024-10-02")
ADO2024.PI3.Helper20241002.ConvertInfinityQSProjectFiles(logger, args);
else if (args[1] == "Day-Helper-2024-10-29")
ADO2024.PI3.Helper20241029.GetFibonacci(logger, args);
else if (args[1] == "Day-Helper-2024-10-30")
ADO2024.PI3.Helper20241030.GetComplete(logger, args);
else if (args[1] == "Day-Helper-2024-10-31")
ADO2024.PI3.Helper20241031.GetComplete(logger, args);
else if (args[1] == "Day-Helper-2024-11-08")
ADO2024.PI4.Helper20241108.WriteMarkdown(logger, args);
else if (args[1] == "Day-Helper-2024-11-15")
ADO2024.PI4.Helper20241115.GetComplete(logger, args);
else if (args[1] == "Day-Helper-2024-12-04")
ADO2024.PI4.Helper20241204.ConvertToUTF8(logger, args);
else if (args[1] == "Day-Helper-2024-12-12")
ADO2024.PI4.Helper20241212.Rename(logger, args);
else if (args[1] == "Day-Helper-2024-12-17")
ADO2024.PI4.Helper20241217.Backup(logger, args);
else if (args[1] == "Day-Helper-2024-12-24")
ADO2024.PI4.Helper20241224.Compare(logger, args);
else if (args[1] == "Day-Helper-2025-01-01")
ADO2025.PI4.Helper20250101.MoveToDelete(logger, args);
else if (args[1] == "Day-Helper-2025-01-14")
ADO2025.PI4.Helper20250114.Rename(logger, args);
else if (args[1] == "Day-Helper-2025-01-26")
ADO2025.PI4.Helper20250126.Move(logger, args);
else if (args[1] == "Day-Helper-2025-02-04")
ADO2025.PI4.Helper20250204.ExtractKanban(logger, args);
else if (args[1] == "Day-Helper-2025-02-18")
ADO2025.PI5.Helper20250218.MoveToArchive(logger, args);
else if (args[1] == "Day-Helper-2025-02-19")
ADO2025.PI5.Helper20250219.Compare(logger, args);
else if (args[1] == "Day-Helper-2025-02-28")
ADO2025.PI5.Helper20250228.PostgresDumpToJson(logger, args);
else if (args[1] == "Day-Helper-2025-03-01")
ADO2025.PI5.Helper20250301.PocketBaseImportWithDeno(logger, args);
else if (args[1] == "Day-Helper-2025-03-05")
ADO2025.PI5.Helper20250305.WriteNginxFileSystemDelta(logger, args);
else if (args[1] == "Day-Helper-2025-03-06")
ADO2025.PI5.Helper20250306.ProcessDataStandardFormatToJson(logger, args);
else if (args[1] == "Day-Helper-2025-03-15")
ADO2025.PI5.Helper20250315.Empty(logger, args);
else if (args[1] == "Day-Helper-2025-03-20")
ADO2025.PI5.Helper20250320.SortCodeMethods(logger, args, cancellationToken);
else if (args[1] == "Day-Helper-2025-03-21")
ADO2025.PI5.Helper20250321.MoveToLast(logger, args);
else if (args[1] == "Day-Helper-2025-04-04")
ADO2025.PI5.Helper20250404.KumaToGatus(logger, args);
else if (args[1] == "Day-Helper-2025-04-07")
ADO2025.PI5.Helper20250407.Sync(logger, args);
else if (args[1] == "Day-Helper-2025-04-21")
ADO2025.PI5.Helper20250421.FreeFileSyncChangeCreatedDate(logger, args);
else if (args[1] == "Day-Helper-2025-04-29")
ADO2025.PI5.Helper20250429.WriteNginxFileSystem(logger, args);
else if (args[1] == "Day-Helper-2025-05-05")
ADO2025.PI5.Helper20250505.HyperTextMarkupLanguageToPortableDocumentFormat(logger, args);
else if (args[1] == "Day-Helper-2025-05-19")
ADO2025.PI6.Helper20250519.LiveSync(logger, args);
else if (args[1] == "Day-Helper-2025-05-21")
ADO2025.PI6.Helper20250521.MatchDirectory(logger, args);
else if (args[1] == "Day-Helper-2025-06-01")
ADO2025.PI6.Helper20250601.EquipmentAutomationFrameworkStatus(logger, args);
else if (args[1] == "Day-Helper-2025-06-02")
ADO2025.PI6.Helper20250602.EquipmentAutomationFrameworkCellInstanceStateImageVerbIf(logger, args);
else if (args[1] == "Day-Helper-2025-06-18")
ADO2025.PI6.Helper20250618.MoveAllButXOfEach(logger, args);
else
throw new Exception(appSettings.Company);
}

Some files were not shown because too many files have changed in this diff.