Facebook chief Mark Zuckerberg has been under intense pressure since March, when a white supremacist gunman used Facebook Live to stream his rampage at two mosques in the New Zealand city of Christchurch, killing 51 people.
The California-based network said it would ban Facebook Live users who shared extremist content and seek to reinforce its own internal controls to stop the spread of offensive videos.
"Following the horrific recent terrorist attacks in New Zealand, we've been reviewing what more we can do to limit our services from being used to cause harm or spread hate," Facebook vice-president of integrity Guy Rosen said in a statement.
"There is a lot more work to do, but I am pleased Facebook has taken additional steps today," Ardern said in a statement.
She and French President Emmanuel Macron will later issue their Christchurch Call to fight the spread of hateful and terror-related content, along with leaders from Britain, Canada, Norway, Jordan and Senegal, who will also be in Paris.
The largely symbolic initiative is intended to keep up the pressure on social media companies, which face growing calls from politicians across the world to prevent their platforms from becoming stages for broadcasting extremist violence.
"We need to get in front of this (problem) before harm is done," Ardern told CNN in an interview on Wednesday. "This is not just about regulation, but bringing companies to the table and saying they have a role too."
Many countries have already tightened legislation to introduce penalties for companies that fail to take down offensive content once it is flagged, by either users or the authorities.
But analysts say the tighter controls pledged on Wednesday will go only so far in preventing people from circumventing rules and policies already in place against disseminating violence and hate speech.
"You can't prevent content from being uploaded: it would require the resources for tracking everything put online by all internet users," said Marc Rees, editor in chief of the technology site Next INpact.
"Can you imagine trying to get TV or radio to prevent libellous, abusive or violent speech that someone might say?" he asked.
The political meeting in Paris will run in parallel to an initiative launched by Macron called "Tech for Good" which will bring together 80 tech executives to discuss how to harness technologies for the common good.
Top officials from Wikipedia, Uber, Twitter, Microsoft and Google will attend, but not Zuckerberg, who met privately with Macron last week.
The US government has not endorsed the Christchurch Call and will be represented only at a junior level at a meeting of G7 digital ministers which is also taking place Wednesday in Paris.
In an opinion piece in The New York Times over the weekend, Ardern said the Christchurch massacre underlined "a horrifying new trend" in extremist atrocities.
Ardern said Facebook removed 1.5 million copies of the video within 24 hours of the attack, but she still found herself among those who inadvertently saw the footage when it auto-played on their social media feeds.
Around 8,000 New Zealanders called a mental health hotline after seeing the video, she told CNN.
In Wednesday's statement, Facebook acknowledged the inadequacy of its own systems.
"People - not always intentionally - shared edited versions of the video which made it hard for our systems to detect," said vice-president of integrity Rosen said.
New Zealand officials said Ardern found a natural partner for the fight against online extremism in Macron, who has repeatedly stated that the status quo is unacceptable.
"Macron was one of the first leaders to call the prime minister after the attack, and he has long made removing hateful online content a priority," New Zealand's ambassador to France, Jane Coombs, told journalists on Monday.
"It's a global problem that requires a global response," she said. (AFP)