In high-resolution solar physics, the volume and complexity of photometric, spectroscopic, and polarimetric ground-based data have increased significantly in the last decade, reaching data acquisition rates of terabytes per hour. This growth is driven by the desire to capture fast processes on the Sun and by the need for short exposure times that "freeze" the atmospheric seeing, thus enabling post-facto image restoration. Consequently, large-format, high-cadence detectors are now routinely used in solar observations to facilitate image restoration. Based on our experience during the "early science" phase with the 1.5-meter GREGOR solar telescope (2014-2015) and the subsequent transition to routine observations in 2016, we describe data collection and data management tailored towards image restoration and imaging spectroscopy. We outline our approaches to data processing, analysis, and archiving for two of GREGOR's post-focus instruments (see http://gregor.aip.de), i.e., the GREGOR Fabry-Perot Interferometer (GFPI) and the newly installed High-Resolution Fast Imager (HiFI). The heterogeneous and complex nature of multi-dimensional data arising from high-resolution solar observations provides an intriguing, but also challenging, example of "big data" in astronomy. The big data challenge has two aspects: (1) establishing a workflow for publishing the data to the entire community and beyond, and (2) creating a Collaborative Research Environment (CRE), where data and computationally intensive post-processing tools are co-located and collaborative work is enabled for scientists from multiple institutes. This requires either collaboration with a data center or frameworks and databases capable of handling huge data sets based on Virtual Observatory (VO) and other community standards and procedures.