TikTok and Meta bosses faced questions from students at Pontypridd High School in Rhondda Cynon Taf.
Platforms said they listen to feedback from users and encourage them to make use of safety features.
The event was organised by Alex Davies-Jones MP as the UK government's online safety bill goes through Parliament.
"There's barely a day that goes by that I don't experience something negative," said 17-year-old Caitlin.
"I personally think social media's pretty dangerous."
"Scrolling through TikTok yesterday and there's many challenges aimed at young girls to eat roughly 300-400 calories in a day… [and] it could trigger many eating disorders."
Many of Caitlin's attempts to report harmful content have been unsuccessful.
"I think the reporting process should be looked into a bit more, since it takes so long for things to be taken down, by the time they are taken down, the harm's already been made."
TikTok said it removes content that promotes disordered eating.
Caitlin had to deal with online abuse after posting about Manchester United footballer Mason Greenwood being arrested on suspicion of rape. He has denied the claim.
"I spoke about the whole Mason Greenwood situation and I got told I need to get beat up about it."
"I feel like because of social media women's voices are being heard less, I feel like we're being pushed back a bit.
"It's kind of like a war on us in a way. It's like we can't really speak without having harm wanted against us."
Others highlighted positive aspects of social media, such as keeping in touch with friends and self-love campaigns.
"There's a lot of body positive things that go around now that never used to," said 17-year-old Isabelle.
Pupils working on a project to improve online safety were given the chance to quiz tech giants at a virtual event.
"Young people feel it is very difficult to report and remove upsetting content. How can you make this process easier?" asked 13-year-old Brooke.
TikTok's head of safety public policy in Europe, Alexandra Evans, said she thinks the platform's reporting mechanisms are "intuitive" but welcomes feedback from users who are struggling with them.
She also highlighted blocking functions: "For example, if you don't like the word 'hate' or 'loser', whatever it may be, you can set a list of words that you will always get filtered from your comments."
Megan Thomas, public policy associate manager at Meta, which owns Facebook and Instagram, said the company had recently developed new features "designed to help prevent people from having to experience any kind of harmful content on our platforms in the first place".
Poppi, 13, wanted to know how many offensive posts are taken down each day, and what consequences are in place for repeat offenders.
Both representatives said they did not know the daily figure, but pointed to quarterly reports.
"There's a spectrum of harm, there's a spectrum of behaviours and we try to be really specific in our responses," said TikTok's Ms Evans.
"But also when it comes to those egregious cases, when it comes to those absolute zero-tolerance behaviours, we are all working together to make sure that we are responding and stamping out that kind of activity across all of our platforms."
New online safety laws are being introduced by the UK government, but Labour MP for Pontypridd and shadow technology minister Alex Davies-Jones warned of "loopholes" in the legislation.
A UK government spokesperson said: "Our pioneering Online Safety Bill will already deliver major improvements to the safety of women and girls, from criminalising cyber flashing to protecting young girls from harmful content."
They added that failure to act by social media companies could result in heavy fines.