(Disclaimer: I am a transhumanist skeptic these days, not to mention a singularity curmudgeon and a critic of Mars colonization, but I still find these ideas nice to chew on sometimes.)
Humans are social animals, and it seems reasonable to assume that any transhuman condition we can wrap our minds around will also be a social one for most of its participants.
Society implies a social contract, that is: we grant one another rights and in return make the concession of respecting each other's rights, in order that our own rights be observed and respected.
And violations of rights tend to be at the root of our concept of crime and injustice—at least, any modern concept of crime once we discard religious justifications and start trying to figure things out from first principles.
Which leads me to ask: in a transhumanist society—go read Accelerando, or Glasshouse, or The Rapture of the Nerds—what currently recognized crimes need to be re-evaluated because their social impact has changed? And what strange new crimes might universally be recognized by a society with, for example, mind uploading, strong AI, or near-immortality?
SF authors are paid to think our way around the outside of ideas, so it's always worth raiding the used fiction bin for side-effects and consequences. Here's qntm's take on the early years of mind uploading—the process of digitizing the connectome of a human brain in order to treat it as software: I strongly suggest you read Lena (if you haven't previously done so) before continuing. It's a short story, structured as a Wikipedia monograph, and absolutely horrifying by implication, for various reasons.
Let me give you that link again: Lena. (Go read: it's short, good fiction, and the rest of this essay will still be here when you get back.)
Mind uploading makes certain assumptions. (Notably: that mind/body dualism is a bust, that there is no supernatural element to consciousness, that we can resolve the structures involved in neurological information processing at sufficient resolution to be useful, and that consciousness emerges from the connectivity and training of the weighted neural network in the wetware.)
Uploading also implies that consciousness is replicable and fungible, which in turn implies that our legal systems can't cope without extensive modification: they rely on an implicit definition of humanity which will by that point be obsolete, as the treatment of MMAcevedo (Mnemonic Map/Acevedo), aka "Miguel" in the story, demonstrates: MMAcevedo is considered by some to be the "first immortal", and by others to be a profound warning of the horrors of immortality.
Historically, our identity has been linear: there is a start, there is a terminus, along the way we are indivisible, although we undergo change over time (and may lose or gain significant portions of our selves—for example, most people retain few or no memories of their life before a point some time between the ages of 3 and 5 years old).
The premature termination of a human life is an irrevocable act, and to deliberately inflict it on someone is seen as a crime (various degrees of murder).
Because our identity is indivisible and of limited duration, time is a rivalrous resource to us: we have to choose what to do with it, or be subject to someone else's choices. (One of the reasons why imprisonment is seen as a punishment—to which we are averse—is the total loss of opportunities to choose what to do with the time taken from us. Yes, there are other reasons: let's ignore them and focus on what this might signify for the posthuman condition.)
There's a fascinating sequence early in Linda Nagata's space opera novel Vast that throws the implications of alienated labour for uploaded minds into stark relief: if you're confronted with a mind-numbingly tedious task that needs human-level cognitive supervision for a period of years or decades, why not divide your time up into chunks and discard the boring ones? You could set up a watchdog timer to reset your uploaded mind to a baseline state every 3 minutes, unless an exception occurs—an emergency that makes you hit the dead man's handle in your environment, at which point the subjective passage of time resumes. In Vast, a human mind is needed to supervise a slower-than-light starship on a voyage that takes centuries, during which nothing much happens. The crew use this three-minute reset cycle to avoid experiencing tedium: subjectively, they condense the entire voyage into 180 seconds. (If you've driven long distance you'll probably have wished for the ability to push a button and find yourself at your destination. Right?)
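For the algorithmically inclined, the watchdog scheme reduces to a surprisingly small loop. Here's a toy sketch of it in Python (the function name, the emergency model, and the numbers are my own illustrative inventions, not anything from Vast):

```python
# Toy simulation of a watchdog-reset scheme for an uploaded mind
# (all names and parameters hypothetical, purely illustrative).
# The upload runs in 3-minute slices; unless an emergency trips the
# dead man's handle during a slice, the slice is discarded and the
# mind rolls back to its baseline snapshot, so the slice was never
# subjectively experienced.

RESET_INTERVAL = 180  # seconds of subjective time per slice

def run_voyage(duration, emergencies):
    """Return total subjective seconds experienced over `duration`
    objective seconds.

    `emergencies` is a set of slice indices during which the dead
    man's handle is hit; only those slices are retained.
    """
    subjective = 0
    slices = duration // RESET_INTERVAL
    for i in range(slices):
        if i in emergencies:
            subjective += RESET_INTERVAL  # slice retained: time was "lived"
        # otherwise: roll back to baseline; the slice never happened,
        # subjectively speaking
    # the final slice, on arrival, is retained rather than rolled back
    subjective += RESET_INTERVAL
    return subjective

# A 200-year voyage with two emergencies en route: the crew member
# subjectively experiences just three slices, i.e. nine minutes.
voyage = 200 * 365 * 24 * 3600
print(run_voyage(voyage, emergencies={7, 9_000_000}))  # prints 540
```

An uneventful voyage collapses to a single 180-second slice regardless of objective duration, which is exactly the trick Nagata's crew exploit.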
Other authors found other angles on this question: the first book in Hannu Rajaniemi's Jean Le Flambeur trilogy (The Quantum Thief) starts with the exact opposite—a thief sentenced to spend a subjective eternity in an escape-proof prison, as a punishment of sorts. Spoiler: he escapes. How he does it and why he was there is the start of yet more musing on what might constitute crimes in a realm populated entirely by uploaded minds. In particular Rajaniemi dives headlong into two really disturbing questions: firstly, the potential for eternal enslavement such a setting offers (never mind perpetual torment), and secondly, what it does to the post-Enlightenment social concept of human equality.
We are all living in the afterglow of a sociological big bang that took place in 1649—the execution of Charles I, who was variously King of England and Wales, Scotland, and Ireland at the time of the Wars of the Three Kingdoms: his trial and execution by a court—appointed by a parliament of the people—shattered the then-prevalent understanding among European/Christian communities that Kings were appointed by God to rule on Earth. A corollary of the Divine Right of Kings is that some people really aren't equal—monarchs, and by extension, aristocrats, have more rights (by religious decree) than other people, and some categories (chattel slavery springs to mind: also the status of women and children) have less. But if the People could try the King for crimes against the state, then what next?
"What next" turned out to be a troublesome precedent. Charles I's younger son James II tried to walk back the uneasy settlement with parliament and got yeeted into exile in 1688-90 as a result, with the resounding and lasting outcome that the powers of the Crown in English and Scottish law were now vested in Parliament, and the head beneath the fancy hat was merely a figurehead who could be sacked if he (or she) acted up. If the monarch wasn't divinely appointed, what set him apart? Numerous philosophical maunderings later it was the French king's turn, and also time for the US Bill of Rights—which, while based on the 1689 English Bill of Rights, implicitly adopted the pernicious logic that there could be no king, no nobility, only free citizens. (Pay no attention to the slaves—for now.)
Here's the thing: our current prevailing political philosophy of human rights and constitutional democracy is invalidated if we have mind uploading/replication or super-human intelligence. (The latter need not be AI; it could be uploaded human minds able to monopolize sufficient computing substrate to get more thinking done per unit baseline time than actual humans can achieve.) Some people are, once again, clearly superior in capability to born-humans. And other persons can be ruthlessly exploited for their labour output without reward, and without even being allowed to know that they're being exploited. Again, see also the subtext of Ken MacLeod's The Corporation Wars trilogy: in which the war between the neoreactionaries and the post-Enlightenment democrats has been won ... by the wrong side.
The second book in The Quantum Thief trilogy, The Fractal Prince, gives us a ghastly look at a world where genocide and enslavement are carried out by forcibly abducting and uploading the last born-human survivors—is it actually genocide if the body is dead but the mind is still there? (It's a new version, of course, of Caedite eos. Novit enim Dominus qui sunt eius: "Kill them all; God will know his own.") It may not be genocide in the currently accepted legal sense of the term—the forcible extermination of a cultural group or of the people who are members of such a group—but it's certainly a comparable abomination.
Our intuitions about crimes against people (and humanity) are based on a set of assumptions about the parameters of personhood that are going to be completely destroyed if mind uploading turns out to be possible. And the only people I see doing much thinking about this (in public) are either SF authors or people pushing a crankish ideology based on 19th century Russian Orthodox theology.
Surely we can do better?