Tremendous progress has been made in artificial intelligence through neural-network-based deep learning systems and their applications. As we approach the post-Moore's-Law era, the limits of semiconductor fabrication technology, together with rapidly increasing data generation rates, pose a growing challenge for tackling new machine learning problems. In parallel, quantum computing has developed rapidly in recent years. Owing to the potential of a quantum speedup, quantum-based learning has become an area of significant interest, in the hope that quantum systems can be leveraged to solve classical problems. In this work, we propose a quantum deep learning architecture and demonstrate our quantum neural network on tasks ranging from binary and multi-class classification to generative modelling. Powered by a modified quantum differentiation function and a hybrid quantum-classical design, our architecture encodes the data with a reduced number of qubits and generates a quantum circuit, loading it onto a quantum platform where the model learns the optimal states iteratively. We conduct extensive experiments in both a local computing environment and on the IBM-Q quantum platform. The evaluation results demonstrate that, on classification tasks trained with the same network settings, our architecture outperforms TensorFlow Quantum by up to 12.51% and a comparable classical deep neural network by up to 11.71%. Furthermore, our GAN architecture runs both the discriminator and the generator purely on quantum hardware and utilizes the swap test on qubits to compute the values of the loss functions. Compared with classical GANs, our quantum GAN achieves similar performance with a 98.5% reduction in parameter count. Moreover, with the same number of parameters, QuGAN outperforms other quantum-based GANs in the literature by up to 125.0% in terms of similarity between the generated distributions and the original data sets.
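
As background for the loss computation mentioned above, the sketch below illustrates a generic swap test, the primitive used to estimate state overlap on quantum hardware; it is an illustrative example only, not the paper's implementation. It assumes Qiskit with the Aer simulator installed, and the rotation angles are arbitrary placeholders rather than trained parameters. The ancilla's probability of measuring 0 satisfies P(0) = (1 + |⟨ψ|φ⟩|²)/2, so the squared overlap is recovered as 2·P(0) − 1.

```python
# Illustrative swap-test sketch (assumed setup, not the authors' code).
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(3, 1)   # qubit 0: ancilla; qubits 1-2 hold the two states
qc.ry(0.8, 1)               # prepare |psi> (placeholder angle)
qc.ry(1.1, 2)               # prepare |phi> (placeholder angle)
qc.h(0)                     # put the ancilla into superposition
qc.cswap(0, 1, 2)           # controlled-SWAP between the two state registers
qc.h(0)                     # interfere; ancilla now encodes the overlap
qc.measure(0, 0)

sim = AerSimulator()
shots = 4096
counts = sim.run(transpile(qc, sim), shots=shots).result().get_counts()
p0 = counts.get("0", 0) / shots
overlap_sq = 2 * p0 - 1     # estimate of |<psi|phi>|^2 from P(ancilla = 0)
print(f"Estimated |<psi|phi>|^2 ~ {overlap_sq:.3f}")
```

In a GAN setting, an overlap estimate of this kind can serve as a similarity score between generated and target states, from which a loss value is derived.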