Communication is critical to collaboration; however, too much of it can degrade performance. Motivated by the need to use a robot's communication modalities effectively, we present a computational framework that decides if, when, and what to communicate during human-robot collaboration. The framework, titled CommPlan, consists of a model specification process and an execution-time POMDP planner. To address the challenge of collecting interaction data, the model specification process is hybrid: part of the model is learned from data, while the remainder is specified manually. Given the model, the robot's decision-making is performed computationally during interaction, under partial observability of the human's mental state. We implement CommPlan for a shared-workspace task in which the robot has multiple communication options and must reason within a short time. Through experiments with human participants, we confirm that CommPlan uses the robot's communication capabilities effectively and improves human-robot collaboration.
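To give a concrete sense of the execution-time decision problem, the sketch below shows a generic belief-update and action-selection loop under partial observability of the human's mental state. It is not the CommPlan planner; the intents, observation model, and utilities are hypothetical placeholders chosen only to illustrate the pattern of deciding whether and how to communicate.

```python
# Minimal illustrative sketch (not the CommPlan planner): track a belief over a
# hypothetical latent human intent and pick the communication action with the
# highest expected one-step utility. All quantities below are assumed.

INTENTS = ["pick_part_A", "pick_part_B"]          # hypothetical mental states
ACTIONS = ["stay_silent", "verbal_cue", "gesture"]  # hypothetical comm. options

# Assumed observation model: P(observed human motion | intent).
OBS_MODEL = {
    "pick_part_A": {"move_left": 0.8, "move_right": 0.2},
    "pick_part_B": {"move_left": 0.3, "move_right": 0.7},
}

# Assumed one-step utility of each action given the true intent, trading off
# task benefit against the cost of interrupting the human.
UTILITY = {
    ("stay_silent", "pick_part_A"): 0.0, ("stay_silent", "pick_part_B"): 0.0,
    ("verbal_cue", "pick_part_A"): 0.6,  ("verbal_cue", "pick_part_B"): -0.2,
    ("gesture", "pick_part_A"): 0.3,     ("gesture", "pick_part_B"): 0.1,
}


def update_belief(belief, observation):
    """Bayesian filter over the latent intent given one observation."""
    posterior = {s: belief[s] * OBS_MODEL[s].get(observation, 1e-6) for s in INTENTS}
    norm = sum(posterior.values())
    return {s: p / norm for s, p in posterior.items()}


def choose_action(belief):
    """Select the action maximizing expected one-step utility under the belief."""
    expected = {a: sum(belief[s] * UTILITY[(a, s)] for s in INTENTS) for a in ACTIONS}
    return max(expected, key=expected.get)


if __name__ == "__main__":
    belief = {s: 1.0 / len(INTENTS) for s in INTENTS}  # uniform prior
    for obs in ["move_left", "move_left"]:             # simulated observations
        belief = update_belief(belief, obs)
        print(f"obs={obs}  belief={belief}  action={choose_action(belief)}")
```

A full POMDP planner would additionally reason over multi-step outcomes and future observations; this sketch only shows the single-step version of the belief-tracking and action-selection structure.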