For parallel logic programming systems to become popular, they must serve the broadest possible range of applications. To achieve this goal, designers of parallel logic programming systems would like to exploit maximum parallelism for existing and novel applications, ideally by supporting both and-parallelism and or-parallelism. Unfortunately, combining both forms of parallelism is a hard problem, and the available proposals cannot match the efficiency of, say, or-parallel-only systems. We propose a novel approach to And/Or Parallelism in logic programs. Our initial observation is that stack copying, the most popular technique in or-parallel systems, does not work well in And/Or systems because memory management becomes much more complex. Copying is also a significant concern in operating systems, where the copy-on-write (COW) technique was developed to address it. We demonstrate that this technique can also be applied to And/Or systems, and present both shared memory and distri...
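
As an illustrative sketch only (not the system described here), the following C program shows the operating-system copy-on-write mechanism the approach builds on: after fork(), the child initially shares the parent's pages, and only the pages the child actually writes are physically copied. The hypothetical worker_stack array merely stands in for a worker's execution stacks.

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    #define STACK_WORDS (4 * 1024 * 1024)   /* roughly 32 MB of "execution stack" */

    int main(void) {
        /* Parent worker: build a large stack image. */
        long *worker_stack = malloc(STACK_WORDS * sizeof(long));
        if (!worker_stack) { perror("malloc"); return 1; }
        for (size_t i = 0; i < STACK_WORDS; i++)
            worker_stack[i] = (long)i;

        pid_t pid = fork();   /* logical copy of the whole address space */
        if (pid < 0) { perror("fork"); return 1; }

        if (pid == 0) {
            /* Child worker: sees an identical stack image, but the kernel has
             * copied nothing yet; pages are shared until written (COW).
             * Writing a few entries faults in and copies only those pages. */
            for (size_t i = 0; i < 1024; i++)
                worker_stack[i] = -1;
            printf("child: modified 1024 words, remaining pages still shared\n");
            _exit(0);
        }

        waitpid(pid, NULL, 0);
        printf("parent: stack unchanged, first word = %ld\n", worker_stack[0]);
        free(worker_stack);
        return 0;
    }

In this sketch the cost of "copying" a worker is paid lazily and only for the pages that actually diverge, which is the property that makes COW attractive as a replacement for eager stack copying.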