We present a method for extracting insights from personal lifelogs. First, we create minute-wise annotations of users' activities with respect to given topics (e.g., socializing, exercising). This is achieved by performing image retrieval with deep learning, followed by fusing multi-modal sensor data. Second, we generate insights into users' activities, including facts about activity occurrence, temporal and spatial patterns, and associations among multiple activities. Finally, we build a prototype mobile app that visualizes these insights according to selected themes. We discuss challenges in this process, as well as opportunities for research and applications in lifelog information access.
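To make the annotation step concrete, the sketch below shows one plausible form of late fusion: per-minute image-retrieval scores are combined with sensor-derived scores to label each minute with an activity. The function name, weights, activity list, and scores are all illustrative assumptions, not the authors' actual pipeline.

```python
ACTIVITIES = ["socialize", "exercise"]

def fuse_minute(image_scores, sensor_scores, w_image=0.6, w_sensor=0.4):
    """Weighted late fusion of two modality score dicts for one minute.

    Hypothetical sketch: weights and score sources are assumptions.
    """
    fused = {}
    for act in ACTIVITIES:
        fused[act] = (w_image * image_scores.get(act, 0.0)
                      + w_sensor * sensor_scores.get(act, 0.0))
    # Label the minute with the highest-scoring activity.
    return max(fused, key=fused.get)

# Example minute: the camera weakly suggests socializing, but the
# accelerometer strongly indicates exercise.
image_scores = {"socialize": 0.7, "exercise": 0.2}
sensor_scores = {"socialize": 0.1, "exercise": 0.9}
label = fuse_minute(image_scores, sensor_scores)  # "exercise"
```

Running this fusion over every minute of a day yields the minute-wise activity timeline on which the later insight-generation steps (occurrence facts, temporal and spatial patterns, cross-activity associations) would operate.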