We developed an asynchronous brain-machine interface (BMI)-based lower limb exoskeleton control system using steady-state visual evoked potentials (SSVEPs). By decoding electroencephalography (EEG) signals in real time, users are able to walk forward, turn right, turn left, sit, and stand while wearing the exoskeleton. SSVEP stimulation is delivered by a visual stimulation unit consisting of five light-emitting diodes (LEDs) mounted on the exoskeleton. A canonical correlation analysis (CCA) method for extracting the frequency information associated with the SSVEP was used in combination with a k-nearest neighbors (KNN) classifier. Overall, 11 healthy subjects participated in the experiment to evaluate performance. To achieve the best classification, CCA was first calibrated in an offline experiment. In the subsequent online experiment, our results exhibit an accuracy of 91.3 ± 5.73%, a response time of 3.28 ± 1.82 s, an information transfer rate (ITR) of 32.9 ± 9.13 bits/min, and a completion time of 1100 ± 154.92 s for the experimental parcours studied. The ability to achieve such high-quality BMI control indicates that SSVEP-based lower limb exoskeletons for gait assistance are becoming feasible.
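To illustrate the frequency-extraction step described above, the following is a minimal sketch of standard CCA-based SSVEP detection: an EEG segment is correlated against sine/cosine reference templates at each candidate stimulation frequency (and its harmonics), and the frequency with the largest canonical correlation is selected. This is a generic illustration, not the authors' exact pipeline; in particular, the KNN stage mentioned in the abstract is omitted here, and all function names, the sampling rate, and the frequency set are illustrative assumptions.

```python
import numpy as np

def cca_max_corr(X, Y):
    # Largest canonical correlation between two data matrices
    # (rows = samples, columns = channels / reference signals).
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    # Singular values of Qx^T Qy are the canonical correlations.
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return float(np.clip(s[0], 0.0, 1.0))

def ssvep_reference(freq, fs, n_samples, n_harmonics=2):
    # Sine/cosine reference set at the stimulation frequency and its harmonics.
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def classify_ssvep(eeg, fs, stim_freqs):
    # Return the index of the stimulation frequency whose reference set
    # has the highest canonical correlation with the EEG segment,
    # together with the per-frequency scores.
    scores = [cca_max_corr(eeg, ssvep_reference(f, fs, eeg.shape[0]))
              for f in stim_freqs]
    return int(np.argmax(scores)), scores
```

In an asynchronous system of this kind, such a scorer would typically run on a sliding EEG window, with a command (e.g., walk, turn, sit, stand) issued only when the winning correlation exceeds a confidence threshold.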