We introduce a formal framework to study the time and space complexity of computing with faulty memory. For the fault-free case, time and space complexities have been studied using the "pebbling game" model. We extend this model to the faulty case, where the contents of memory cells may be erased. The model captures notions such as "checkpoints" (keeping multiple copies of intermediate results) and "recovery" (partial recomputation in the case of failure). Using this model, we derive tight bounds on the time and/or space overhead inflicted by faults. As a lower bound, we exhibit cases where f worst-case faults may necessitate an Ω(f) multiplicative factor overhead in computational resources (time, space, or their product). The lower bound holds regardless of the computing and recomputing strategy employed. A matching upper-bound algorithm establishes that an O(f) multiplicative overhead always suffices. For the special class of binary tree computations, we show that f...
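As a rough illustration (not taken from the paper), the following Python sketch simulates one way to play a pebbling game on a DAG when an adversary may erase pebbles (memory faults) and the player recovers by recomputing erased intermediate results. The function name `pebble_with_faults`, the dictionary encoding of the DAG, and the naive recompute-on-demand recovery strategy are all illustrative assumptions, not the paper's model or algorithm.

```python
# Minimal sketch (illustrative assumptions, not the paper's algorithm):
# pebbling a DAG where an adversary erases pebbles and the player
# recovers by recomputing the erased intermediate results.

def pebble_with_faults(dag, target, fault_schedule):
    """Pebble `target` in `dag` (dict: node -> list of predecessors).

    `fault_schedule` maps a move number to a node whose pebble is erased
    just before that move, modelling a worst-case memory fault.
    Returns the total number of pebbling moves (a crude 'time' measure).
    """
    pebbled = set()
    moves = 0

    def compute(node):
        nonlocal moves
        if node in pebbled:
            return
        for pred in dag.get(node, []):
            compute(pred)                 # ensure all predecessors are pebbled
        victim = fault_schedule.pop(moves, None)
        if victim is not None:
            pebbled.discard(victim)       # adversarial erasure (a fault)
        for pred in dag.get(node, []):
            compute(pred)                 # recovery: recompute erased inputs
        pebbled.add(node)                 # place the pebble (one move)
        moves += 1

    compute(target)
    return moves


if __name__ == "__main__":
    # A small binary-tree computation: leaves a, b, c, d; internal x, y; root r.
    dag = {"x": ["a", "b"], "y": ["c", "d"], "r": ["x", "y"]}
    print("fault-free moves:", pebble_with_faults(dag, "r", {}))
    print("with one fault:  ", pebble_with_faults(dag, "r", {5: "x"}))
```

On this toy binary tree, a single well-timed erasure forces extra recomputation beyond the fault-free move count, illustrating the kind of multiplicative overhead that the abstract's Ω(f) lower bound and O(f) upper bound characterize in the worst case.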