I'm trying to develop PowerShell code to process a very large file, where every record is processed by the same block of code over and over.
If the file has 100,000 records, it would be ideal to chop it into 50 chunks of records (or however many, depending on memory and CPU constraints) and use Start-Job to process the chunks in parallel, rather than reading the file with Get-Content one record at a time.
Anyone have any examples they care to share?
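For reference, here is a minimal sketch of the chunk-and-Start-Job approach described above. The record source, the chunk size of 5, and the per-record processing block are all placeholders; a real run would replace `$records` with something like `Get-Content bigfile.txt` and tune `$chunkSize` to the available memory and CPU:

```powershell
# Stand-in for reading the large file, e.g. $records = Get-Content bigfile.txt
$records   = 1..20 | ForEach-Object { "record $_" }
$chunkSize = 5   # placeholder; tune to memory/CPU limits

# Split the record array into chunks of $chunkSize
$chunks = for ($i = 0; $i -lt $records.Count; $i += $chunkSize) {
    ,($records[$i..([Math]::Min($i + $chunkSize, $records.Count) - 1)])
}

# Start one background job per chunk; each job runs the same
# processing block over its slice of the records
$jobs = foreach ($chunk in $chunks) {
    Start-Job -ArgumentList (,$chunk) -ScriptBlock {
        param($lines)
        foreach ($line in $lines) {
            # placeholder for the real per-record processing block
            "processed: $line"
        }
    }
}

# Wait for all jobs, collect their output, and clean up
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
$results.Count
```

Note that each `Start-Job` spawns a separate PowerShell process, so job startup and result serialization have real overhead; one job per chunk (not per record) keeps that overhead amortized, which is why the chunking step matters.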