I still would not optimize this unless you are certain it is unmaintainable. The added complexity is more likely to introduce bugs and performance penalties than to yield a real improvement. Or, as the old IT saying goes: "premature optimization is the root of all evil". A well-indexed server should be able to handle millions of records.
“We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified”[5] - Donald Knuth
With a capable server, enough memory, correct indexes, optimized retrieves in microflows (don't loop through all persons), and optimized queries, a database server can handle millions of records.
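The "don't loop through all persons" point is the key one: push the filter into the query instead of retrieving everything and filtering in application code. A minimal sketch with SQLite, using a hypothetical `person` table (not your actual schema):

```python
import sqlite3

# Hypothetical schema for illustration only; your entity and
# attribute names will differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT, archived INTEGER)")
conn.executemany(
    "INSERT INTO person (name, archived) VALUES (?, ?)",
    [(f"person{i}", i % 2) for i in range(10_000)],
)

# With this index the database can jump straight to matching rows;
# without it, the WHERE clause below forces a full table scan.
conn.execute("CREATE INDEX idx_person_archived ON person (archived)")

# Anti-pattern: retrieve all records, then filter in a loop.
slow = [row for row in conn.execute("SELECT * FROM person") if row[2] == 0]

# Better: let the database do the filtering so the index is used.
fast = conn.execute("SELECT * FROM person WHERE archived = 0").fetchall()

assert len(slow) == len(fast)
```

Both queries return the same rows, but only the second one scales: its cost grows with the number of matches, not with the size of the whole table.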
If performance is still insufficient, you can consider splitting the person entity into person and archiveperson and creating the same associations for both.
Another option is to create an inherited entity (archiveperson), with the base entity (person) holding all the associations.
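The difference between the two options can be sketched in plain classes. All names here are made up for illustration; in a split model each entity duplicates the associations, while in an inherited model the base entity owns them once:

```python
from dataclasses import dataclass, field

# Option 1: two separate entities, associations duplicated on each.
@dataclass
class Person:
    name: str
    orders: list = field(default_factory=list)  # association to Order

@dataclass
class ArchivePerson:
    name: str
    orders: list = field(default_factory=list)  # same association, duplicated

# Option 2: inheritance -- the base entity carries all associations,
# and the archive specialization only adds what differs.
@dataclass
class ArchivedPerson(Person):
    archived_on: str = ""

p = ArchivedPerson(name="Alice", archived_on="2020-01-01")
assert isinstance(p, Person)   # inherits the associations of the base
```

With inheritance you avoid duplicating every association, but all records still live in the base entity's table, so it helps maintainability more than raw query speed; the split-entity option physically separates archived data at the cost of duplicated structure.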