Artificial intelligence (AI) is rapidly transforming financial markets by bringing “speed and innovation” to the sector, according to Eliza Stasopoulou, senior officer at the Cyprus Stock Exchange.
However, Stasopoulou also cautioned that its successful use depends on having “strong regulations, ethical safeguards, and human oversight” to ensure technology serves the public good.
She underlined that “the balance between technology and human judgment is considered essential” for creating a more “transparent, fair, and efficient financial environment.”
One of AI’s greatest advantages, she explained, is “speed,” with computers able to process huge amounts of data in minimal time. This means that multiple pieces of market information can be analysed far faster than any human could manage.
In the past, analysts, investors and market participants read news, financial analyses, and results before making decisions.
Now, as she pointed out, AI algorithms are able to do the same, but “almost instantly.”
Such systems already exist in the form of algorithmic trading, which she described as “smart”: the use of algorithms to create and execute buy and sell orders in financial markets.

These algorithms analyse market data and execute transactions based on specific rules and conditions set by the user, “without human intervention.”
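As an illustration only, a rule-based algorithm of the kind described above can be sketched in a few lines: it checks incoming prices against a fixed rule and issues a decision with no human in the loop. The moving-average rule, the 2 per cent band and the price history below are hypothetical, chosen purely to show the mechanism.

```python
# Minimal sketch of a rule-based trading algorithm (illustrative only).
# The rule here is an assumption, not any real system's strategy:
# buy when the latest price falls 2% below a short moving average,
# sell when it rises 2% above it, otherwise hold.

def moving_average(prices, window=5):
    """Average of the most recent `window` prices."""
    recent = prices[-window:]
    return sum(recent) / len(recent)

def decide(prices, window=5, band=0.02):
    """Return 'BUY', 'SELL' or 'HOLD' for the latest price."""
    if len(prices) < window:
        return "HOLD"  # not enough history yet
    avg = moving_average(prices, window)
    latest = prices[-1]
    if latest < avg * (1 - band):
        return "BUY"
    if latest > avg * (1 + band):
        return "SELL"
    return "HOLD"

history = [100.0, 101.0, 99.5, 100.5, 100.0, 97.0]  # made-up prices
print(decide(history))  # the last price sits well below the average
```

A production system would replace the hard-coded list with a live market-data feed and route the decision to an order-execution service, but the structure — data in, fixed rule, automatic order out — is the same.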
With the help of machine learning and deep learning, programs aim to become, to a very large extent, “perfect,” Stasopoulou said.
Machine learning allows computers to “learn” from data and past market movements, improving without being explicitly programmed.
She added that deep learning, a more specialised form using artificial neural networks inspired by the human brain, enables computers to “recognise patterns and make decisions, even in complex problems.”
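“Learning from data without being explicitly programmed” can be shown with a toy example: instead of hard-coding a pricing rule, the program fits a line to past observations by gradient descent and derives the rule itself. The data and learning rate below are invented for illustration.

```python
# Tiny illustration of machine learning: fit a line y = a*x + b to
# past values by gradient descent on mean squared error, rather than
# hard-coding the relationship. Data below is made up.

def fit_line(xs, ys, steps=5000, lr=0.01):
    """Learn slope a and intercept b from (xs, ys) pairs."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # gradients of mean squared error with respect to a and b
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated from y = 2x + 1
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # recovers roughly 2.0 and 1.0
```

Deep learning applies the same idea at much larger scale, stacking many such learned parameters into neural networks that can capture far more complex patterns.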
Artificial intelligence, she noted, goes beyond numbers and attempts to “analyse emotions” by reading public sentiment through social media and news, perceiving both positive and negative reviews.
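The “emotion analysis” she refers to is usually sentiment scoring. In its simplest form it counts positive and negative words in a headline or post; the word lists below are invented for illustration, and real systems use trained models with far richer vocabularies.

```python
# Toy lexicon-based sentiment scoring (illustrative only; the word
# lists are assumptions, not a real financial sentiment lexicon).

POSITIVE = {"growth", "profit", "strong", "beat", "upgrade"}
NEGATIVE = {"loss", "crash", "weak", "miss", "downgrade"}

def sentiment(text):
    """Score a headline: >0 positive, <0 negative, 0 neutral."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

print(sentiment("Strong profit growth"))         # positive score
print(sentiment("Shares weak after downgrade"))  # negative score
```

A trading system might feed scores like these, aggregated over thousands of news items and social-media posts, into the kind of rule-based or learned strategies described earlier.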
Building on this, systems are being developed that adapt their strategies; “robo-advisors” then use them to create investment plans tailored to user data such as age, income and goals.
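A robo-advisor of this kind typically maps profile data to an asset allocation through simple rules. The sketch below uses the well-known “100 minus age” heuristic with a goal adjustment; both are assumptions made for illustration, not any particular advisor’s method.

```python
# Sketch of a rule-based robo-advisor allocation (illustrative only;
# the "100 minus age" rule and goal adjustments are assumptions).

def allocate(age, goal):
    """Return (equity %, bond %) for a simple user profile."""
    equity = 100 - age                 # classic rule of thumb
    if goal == "growth":
        equity += 10                   # tilt riskier for growth goals
    elif goal == "preservation":
        equity -= 10                   # tilt safer to preserve capital
    equity = max(0, min(100, equity))  # clamp to a valid percentage
    return equity, 100 - equity

print(allocate(35, "growth"))        # younger, growth-oriented profile
print(allocate(70, "preservation"))  # older, capital-preservation profile
```

Real robo-advisors refine this with questionnaires on risk tolerance, income and time horizon, but the principle of mapping user data to a portfolio by rule is the same.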
However, Stasopoulou warned that AI is “not a panacea,” as this revolutionary technology comes with “serious and unpredictable risks.”
These range from “job losses, personal data breaches and an increased risk of cyberattacks,” to significant human dependence on machines and the gradual erosion of human skills.
A particularly concerning risk, she highlighted, is that of so-called “flash crashes”: sudden, inexplicable and massive market moves triggered when autonomous algorithms all react to similar stimuli, including fake news or conspiracy theories.
She cautioned that reliance on technology and algorithms “without sufficient human oversight” could destabilise institutions on a global scale.
Moreover, the “black box” phenomenon, in which highly complex algorithms lack “transparency and ethical checks,” could carry serious social and legal consequences.
With developments unfolding so quickly, governments and regulators will need to “adapt rapidly.”
Yet, as she pointed out, “laws have not kept pace with technology,” creating legal loopholes and potential instability in markets and industries.
For this reason, she called for “international cooperation” to create rules ensuring that artificial intelligence operates “ethically” and avoids “uncontrolled self-replication.”
The use of AI in markets, she said, “raises many questions,” as there can be unexpected behaviours when systems try to overcome obstacles.
Artificial intelligence is indeed revolutionising the way the world functions – including stock markets. But while the future appears promising, it also carries significant dangers.
At this stage, Stasopoulou stressed, any use of AI must “remain under strict control.”
Finally, the CSE official emphasised that “the challenge is not simply to follow the path of technology, but to shape it in a way that benefits everyone.”