Accurate breast cancer detection using automated algorithms remains an open problem in the literature. Although a large body of work has addressed this issue, no definitive solution has yet been found. The problem is further exacerbated by the fact that most existing datasets are imbalanced, i.e., the number of instances of one class far exceeds that of the others. In this paper, we propose a framework based on transfer learning to address this issue, focusing on the classification of imbalanced histopathological images. We use the popular VGG-19 network as the base model and complement it with several state-of-the-art techniques to improve the overall performance of the system. Taking the ImageNet dataset as the source domain, we apply the learned knowledge to a target domain consisting of histopathological images. Through experiments on a large-scale dataset of 277,524 images, we show that the proposed framework outperforms approaches reported in the existing literature. Based on numerical simulations conducted on a supercomputer, we also present guidelines for future work on transfer learning and imbalanced image classification.
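To make the transfer-learning setup described above concrete, the following is a minimal sketch, assuming a PyTorch/torchvision pipeline: a VGG-19 backbone pretrained on ImageNet (the source domain) has its convolutional features frozen, its final classifier layer is replaced with a two-class head for the histopathological target domain, and a class-weighted loss is used as one common way to compensate for class imbalance. The specific layer choices, class weights, and optimizer settings here are illustrative assumptions, not the exact configuration used in the paper.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg19, VGG19_Weights

# Source domain: VGG-19 pretrained on ImageNet.
model = vgg19(weights=VGG19_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature extractor so only the new head is trained
# (a typical transfer-learning choice; the paper's fine-tuning depth may differ).
for param in model.features.parameters():
    param.requires_grad = False

# Target domain: replace the final ImageNet classifier (4096 -> 1000)
# with a binary head for benign vs. malignant histopathological patches.
model.classifier[6] = nn.Linear(4096, 2)

# Class-weighted cross-entropy as one way to handle imbalance;
# the weights below are placeholder values, not taken from the dataset.
class_weights = torch.tensor([1.0, 2.5])
criterion = nn.CrossEntropyLoss(weight=class_weights)

# Optimize only the parameters that still require gradients (the new head).
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# Dummy forward/backward pass on random data to illustrate the training step;
# in practice this would iterate over a DataLoader of histopathological patches.
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 1, 0])
logits = model(images)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

In this sketch, only the final linear layer is updated during training; unfreezing deeper convolutional blocks, or using alternative imbalance remedies such as oversampling or augmentation, are natural variations of the same transfer-learning recipe.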