
    The rise and fall of Higgsfield in 30 days

    How Higgsfield AI went from $1.3B valuation to X suspension, and what it reveals about AI’s creator problem


    On January 15, 2026, Higgsfield AI announced a $130 million Series A funding round. The San Francisco startup had reached unicorn status with a $1.3 billion valuation. TechCrunch ran the headline. Forbes noted the founders. Investors cheered.

    Twenty-five days later, the company’s X account with almost 200,000 followers was suspended.

    The rapid fall from industry darling to platform pariah wasn’t an accident or a technical glitch. It was the inevitable collision between a company that treated creators as disposable and a creator community that refused to stay quiet about it.

    This is the story of what happens when a well-funded AI startup forgets that the people it relies on are, well, people.


    The unicorn that bragged about ending jobs

    Higgsfield AI positioned itself as a democratizer of video creation. Founded by Alex Mashrabov, a former Snap executive and Forbes 30 Under 30 honoree, alongside co-founder Yerzat Dulat, the company promised to let anyone generate professional-quality videos using artificial intelligence.

    The pitch worked. Accel, Menlo Ventures, and AI Capital Partners led the funding round. The company claimed $200 million in annual recurring revenue and 15 million users generating 4.5 million videos daily. By the numbers, Higgsfield looked like the real deal.

    But numbers don’t tell the whole story. While Higgsfield presented itself as a cutting-edge AI company building the future of video, the reality was messier. Rather than developing foundation models, the company largely aggregated third-party AI APIs, wrapping them in a consumer-friendly interface. That’s not inherently wrong, but it meant Higgsfield’s competitive advantage wasn’t proprietary technology. It was marketing and growth at all costs.

    Then came February 5. Higgsfield posted on X about “ending 20+ creative jobs” with its AI tools. The post wasn’t framed as a regrettable side effect of progress. It was a celebration of displacement.

    The backlash was immediate and ferocious.

    “Celebrating the end of artists’ careers (even when their motion design tools are not actually good enough to render anyone unemployed) is just super dumb and shortsighted,” said Aharon Rabinowitz, CEO of Motion Management.

    The post revealed something uncomfortable about Higgsfield’s attitude toward the creative professionals whose work had helped build its brand. But that attitude wasn’t new. It had been there all along, visible to anyone who looked closely enough.


    The creators left unpaid

    Multiple creators have come forward with similar stories. Higgsfield would reach out, praise their portfolios, and offer paid collaborations. The work would get delivered and deployed across Higgsfield’s marketing channels. Then the invoices would go unanswered. Follow-up emails were ignored. Payment deadlines passed.

    Leo Kadieff, a creative professional who documented the issues, wrote on LinkedIn about “leaving a dozen creatives unpaid” in the company’s sponsored content programs. Other creators echoed similar experiences in comments and community forums. The pattern was consistent: promising collaborations that turned into payment disputes.

    “I know a great bunch of creatives that never were paid in former campaigns,” wrote one commenter on Kadieff’s post. Another described being dropped from the program after creating content, then asked to produce additional work for free.

    One creator took to Reddit with a post titled “After 50+ collabs I got scammed by Higgsfield AI.” They described being offered a “sponsored opportunity” that turned into an unpaid nightmare. When they finally pushed for payment, Higgsfield refused, citing “suspicious activity” on their account without providing any evidence.

    For a company that had just raised nine figures, the unpaid invoices shouldn’t have been a problem. But the issue wasn’t cash flow. It was culture.


    The allegations pile up

    Non-payment was just the beginning. As the creator community began comparing notes, a fuller picture of Higgsfield’s practices emerged.

    Deceptive marketing: The company advertised “unlimited” subscription plans that were quietly throttled after users hit undisclosed limits. As one user, Ian Hudson, put it: “Either something’s unlimited or it’s not, but it turns out it’s not unlimited.” The gap between promise and reality left paying customers feeling misled.

    Undisclosed paid promotions: Higgsfield ran influencer campaigns featuring creators who were supposed to be demonstrating the AI’s capabilities. But the assets in those videos weren’t AI-generated. They were pre-made by human designers, then presented as evidence of what Higgsfield’s AI could do. The paid nature of these partnerships wasn’t disclosed.

    Intellectual property concerns: Artist Ade, an animator, publicly accused Higgsfield of copying her work to sell its generative AI services. “This is a high profile ‘unicorn’ company that tried to sell their generative AI video services by copying my artwork,” she wrote. The company didn’t respond to her allegations publicly.

    Individually, each issue might have been manageable. Together, they painted a picture of a company that saw creators as resources to be extracted, not partners to be respected.


    The suspension that shocked no one

    On February 9, 2026, Higgsfield’s X account vanished. One day it had almost 200,000 followers and was posting about revolutionising video creation. The next, visitors saw a generic message: “Account suspended.”

    X doesn’t typically comment on individual suspensions, so the exact trigger remains unclear. But the timing was telling. The suspension came shortly after Higgsfield’s “ending creative jobs” post sparked widespread outrage. It followed a surge of reports from creators sharing their experiences. The platform’s algorithms may have detected coordinated activity, or the company may have violated specific terms of service.

    The most likely explanation is that X received a flood of reports about Higgsfield’s behaviour and determined the account had violated community standards. Mass reporting alone doesn’t guarantee suspension; there has to be an underlying policy violation. But when multiple users document non-payment, deceptive practices, and intellectual property concerns, platforms take notice.

    For the creators still waiting for payment, the suspension was both vindicating and frustrating. Vindicating because it showed their voices had been heard. Frustrating because it didn’t get them their money back.


    The CEO’s carefully worded non-apology

    On February 11, two days after the suspension, Alex Mashrabov finally broke his silence. His statement acknowledged the situation without fully addressing it.

    Mashrabov claimed the X suspension was the result of “coordinated mass reporting” and insisted it didn’t reflect any actual wrongdoing. He promised that outstanding invoices would be paid and attributed the payment delays to administrative issues.

    The statement had gaps. It didn’t explain why the payment issues had persisted for months. It didn’t address the deceptive marketing allegations. It didn’t respond to the intellectual property concerns raised by Artist Ade. And it didn’t grapple with the core issue that had angered so many people: the company’s apparent attitude toward creators.

    A commitment to pay outstanding invoices is better than silence. But for creators who had been ghosted for months, words felt cheap. The real test would be whether the money actually arrived.


    The broader pattern

    Higgsfield isn’t the first AI company to clash with creators, and it won’t be the last. The tension between AI development and creative labour is structural, not incidental.

    AI companies need training data, example content, and human-generated material to both train their models and market their capabilities. That means they need creators. But the economics of AI tend to treat creative work as a commodity to be minimised rather than a skill to be valued.

    We’ve seen this pattern before. AI image generators trained on artists’ work without consent or compensation. AI writing tools built on scraped content. AI voice tools that can mimic performers without their permission. Each time, the same dynamic plays out: a well-funded company extracts value from creators, then celebrates the efficiency of the extraction.

    Higgsfield just made the pattern more visible because it happened so fast and so publicly. Within 30 days, the company went from industry success story to cautionary tale.

    The timing was particularly stark. On February 10, 2026, the day after Higgsfield’s suspension, competitor Runway announced it had raised $315 million at a $5.3 billion valuation. Two companies in the same space, two very different trajectories. Runway has built its own foundation models and maintained generally positive relationships with the creative community. Higgsfield took shortcuts.


    What’s at stake

    The Higgsfield story matters beyond one company’s reputation. It’s a test case for how the AI industry treats the creative workers it depends on.

    If companies can raise massive funding rounds while routinely underpaying or failing to pay creators, the message is clear: creative labour is expendable. If platforms like X will suspend accounts only after sustained public pressure, the message is that bad behaviour is tolerated until it becomes a PR problem.

    But there’s a more hopeful reading. Higgsfield’s rapid downfall suggests that creator communities can hold powerful companies accountable. The X suspension didn’t happen because of regulatory action or investor intervention. It happened because creators organised, shared their stories, and refused to be ignored.

    This is what happens when a community decides it’s had enough. One suspension won’t fix the industry, but it shows that collective action works.

    For freelancers and independent creators, the lesson is sobering but important. Well-funded companies can still be bad actors. Contracts matter. Documentation matters. Community matters.

    For the AI industry, the lesson should be clearer still. You can’t build the future of creativity while treating creatives as disposable. Sooner or later, the people you’re burning will burn you back.

    I write about AI’s impact on creative work regularly. If this kind of accountability journalism is useful to you, consider subscribing so you don’t miss the next one.

