In my view, you do not need to upload a second dummy .p8 file.
At this point, if the connector requires the same private key file to be read again for every page of results, then the real issue is the current connector design, not your implementation.
From what you described, you already tested the main safe workarounds: reading the file again, duplicating it, and refilling it with StringToFile. If all of those still fail because the stream is consumed or the key format is altered, then there is no clean workaround left in pure microflow logic that keeps the module unchanged.
So architecturally, I would say there are only two proper options:

1. Keep the Marketplace module untouched and move the token/key handling into your own custom logic.
2. Fix the connector itself so it caches the parsed key/value instead of reconsuming the FileDocument.

Using a second dummy file would only be a workaround, and a fragile one; I would not consider it a good long-term design. So my view is: this is no longer a usage issue, but a connector limitation.
The most likely root cause is that the private key file is being consumed too early.
In your main microflow you use String from file to generate the first token. This reads the entire file and effectively empties the file stream. When Snowflake returns the first page and the connector tries to fetch the next page, it needs to generate a new token again. At that point the module attempts to read the same Private Key object, but since the stream was already consumed, it reads 0 bytes. This results in errors like Unable to decode key or the DecryptPrivateKey null exception.
This also explains why the issue only happens when the result set is larger than one page (>200 rows). If the query returns fewer rows, only one token is generated and everything works.
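To make the mechanism concrete, here is a minimal plain-Java sketch of why the second read fails. This does not use the Mendix or Snowflake APIs; the class and method names are illustrative only. A Java `InputStream` is forward-only, so once its bytes are drained, a second read from the same stream returns nothing:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class ConsumedStreamDemo {
    // Drains the stream completely, like "String from file" does
    // with the FileDocument's content stream.
    static byte[] readAll(InputStream in) throws Exception {
        return in.readAllBytes();
    }

    public static void main(String[] args) throws Exception {
        InputStream keyStream =
            new ByteArrayInputStream("-----BEGIN PRIVATE KEY-----".getBytes());

        byte[] first = readAll(keyStream);   // full key content (token for page 1)
        byte[] second = readAll(keyStream);  // stream already consumed: empty (token for page 2)

        System.out.println(first.length > 0);   // true
        System.out.println(second.length == 0); // true
    }
}
```

The second, empty read is what the key parser receives when pagination triggers a new token, which matches the "Unable to decode key" / null-key symptoms you are seeing.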
Recommended fix:
Do not read the private key file in the main microflow before calling the Snowflake module. Either let the connector read the key itself, or read it once and store the value in a reusable variable/object instead of trying to read the same file document again.
In short: avoid reusing an already-consumed file stream.
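If you do move the key handling into your own custom logic (for example a Mendix Java action), the read-once pattern could look roughly like this. `PrivateKeyCache` is a hypothetical helper name, not part of any connector API; the point is only that the file stream is consumed a single time and every token generation afterwards works from the cached value:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

// Hypothetical helper: consume the key file once, keep the content,
// and hand out a fresh, independent stream for each token generation.
public class PrivateKeyCache {
    private final String keyPem;

    public PrivateKeyCache(InputStream keyFile) throws Exception {
        // The one and only read of the file stream.
        this.keyPem = new String(keyFile.readAllBytes());
    }

    // Every caller gets its own stream over the cached bytes,
    // so pagination can generate as many tokens as it needs.
    public InputStream openStream() {
        return new ByteArrayInputStream(keyPem.getBytes());
    }

    public String asString() {
        return keyPem;
    }
}
```

The microflow equivalent is the same idea: call "String from file" once, keep the result in a string variable, and pass that variable (never the FileDocument itself) to each token-generation step.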
If this resolves your issue, please mark this answer as accepted so it can help others facing the same problem.