We present a novel approach for fully automated generation of restoration shapes for fractured objects, using learned implicit shape representations in the form of occupancy functions. Our approach lays the groundwork for automated object repair via additive manufacturing. Existing approaches for restoring fractured shapes either require prior knowledge of object structure, such as symmetries between the restoration and the fractured object, or predict restorations as voxel outputs that are impractical for repair at current resolutions. By leveraging learned occupancy functions for restoration prediction, our approach avoids the curse of dimensionality inherent to voxel-based approaches while producing plausible restorations. Given a fractured shape, we fit a function to occupancy samples from the shape to infer a latent code. We then apply a learned transformation to the fractured-shape code to predict a corresponding code for restoration generation. To ensure physical validity and well-constrained shape estimation, we contribute a loss that models feasible occupancy values for fractured shapes, restorations, and the complete shapes obtained by joining the two. Our work overcomes deficiencies of shape-completion approaches adapted for repair, and enables consumer-driven object repair and cultural heritage object restoration. We share our code and a synthetic dataset of fractured meshes from 8 ShapeNet classes at: https://github.com/Terascale-All-sensing-Research-Studio/MendNet.
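To illustrate the latent-code inference step described above, the following is a minimal, hypothetical sketch of auto-decoder-style fitting: given occupancy samples from a shape and a frozen decoder mapping a latent code and a query point to an occupancy probability, the code is recovered by gradient descent on a cross-entropy loss. The linear decoder, its weights, and all dimensions here are toy assumptions for illustration; the actual method uses a learned deep network and its own loss formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

D_LATENT, D_POINT = 8, 3
# Frozen "decoder" weights (toy stand-in for a trained network).
W_z = rng.normal(scale=0.5, size=D_LATENT)
W_x = rng.normal(scale=0.5, size=D_POINT)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def decode(z, pts):
    # Occupancy probability at each query point given latent code z.
    return sigmoid(pts @ W_x + z @ W_z)

def infer_latent(pts, occ, steps=500, lr=0.5):
    # Auto-decoder inference: optimize only the latent code z,
    # keeping the decoder weights fixed.
    z = np.zeros(D_LATENT)
    for _ in range(steps):
        p = decode(z, pts)
        # Analytic gradient of mean binary cross-entropy w.r.t. z.
        grad = np.mean(p - occ) * W_z
        z -= lr * grad
    return z

# Synthetic occupancy samples generated from a known code z_true.
z_true = rng.normal(size=D_LATENT)
pts = rng.normal(size=(256, D_POINT))
occ = (decode(z_true, pts) > 0.5).astype(float)

z_hat = infer_latent(pts, occ)
acc = np.mean((decode(z_hat, pts) > 0.5) == (occ > 0.5))
```

In this sketch, `z_hat` reproduces the occupancy labels of the synthetic shape; in the full method, the inferred code is then transformed into a restoration code rather than used directly.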