Companies

  • LinkedIn - Apache Kafka is used at LinkedIn for activity stream data and operational metrics. This powers products such as the LinkedIn Newsfeed and LinkedIn Today, in addition to our offline analytics systems like Hadoop.
  • Tumblr - See this
  • Mate1.com Inc. - Apache Kafka is used at Mate1 as our main event bus; it powers our news and activity feeds and our automated review systems, and it will soon power real-time notifications and log distribution.
  • Tagged - Apache Kafka drives our new pub/sub system, which delivers real-time events for users in our latest game, Deckadence. It will soon be used in a host of new use cases, including group chat and back-end stats and log collection.
  • Boundary - Apache Kafka aggregates high-flow message streams into a unified distributed pubsub service, brokering the data for other internal systems as part of Boundary's real-time network analytics infrastructure.
  • DataSift - Apache Kafka is used at DataSift as a collector of monitoring events and to track users' consumption of data streams in real time. http://highscalability.com/blog/2011/11/29/datasift-architecture-realtime-datamining-at-120000-tweets-p.html
  • Wooga - We use Kafka to aggregate and process tracking data from all our Facebook games (which are hosted at various providers) in a central location.
  • AddThis - Apache Kafka is used at AddThis to collect events generated by our data network and broker that data to our analytics clusters and real-time web analytics platform.
  • Urban Airship - At Urban Airship we use Kafka to buffer incoming data points from mobile devices for processing by our analytics infrastructure.
  • Metamarkets - We use Kafka to collect real-time event data from clients, as well as our own internal service metrics, which feed our interactive analytics dashboards (a minimal sketch of this event-collection pattern appears after this list).
  • SocialTwist - We use Kafka internally as part of our reliable email queueing system.
  • Countandra - Countandra is a hierarchical distributed counting engine that uses Kafka as its primary high-speed interface as well as for routing events for cascading counting.
  • FlyHajj.com - We use Kafka to collect all metrics and events generated by the users of the website.
  • Twitter - Kafka is used as part of Twitter's Storm stream-processing infrastructure, e.g. this.
  • uSwitch - See this blog.
  • InfoChimps - Kafka is part of the InfoChimps real-time data platform.
  • Visual Revenue - We use Kafka as a distributed queue in front of our web traffic stream processing infrastructure (Storm).
  • Ooyala - Kafka is used as the primary high-speed message queue to power Storm and our real-time analytics/event ingestion pipelines.
  • Foursquare - Kafka powers online-to-online and online-to-offline messaging at Foursquare. We integrate with monitoring, production systems, and our offline infrastructure, including Hadoop.
  • Datadog - Kafka brokers data to most systems in our metrics and events ingestion pipeline. Different modules contribute data to it and consume data from it, for streaming CEP (homegrown), persistence (at different "temperatures" in Redis, ElasticSearch, Cassandra, S3), or batch analysis (Hadoop).
  • VisualDNA - We use Kafka (1) as an infrastructure that helps us continuously bring tracking events from various datacenters into our central Hadoop cluster for offline processing, (2) as a propagation path for data integration, and (3) as a real-time platform for future inference and recommendation engines.
  • Sematext - In SPM (performance monitoring), Kafka is used for performance metrics collection and feeds SPM's in-memory data aggregation (OLAP cube creation) as well as our CEP/Alerts servers. In SA (search analytics), Kafka is used for search stream collection before the data is aggregated and persisted in HBase.
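
Many of the entries above share the same basic pattern: producers append events (tracking beacons, metrics, log lines) to a Kafka topic, and downstream systems such as Hadoop, Storm, or alerting services consume them independently. The sketch below is only a minimal, hypothetical illustration of the producing side using the Kafka Java client; the broker address, topic name, key, and payload are placeholders and are not taken from any of the deployments listed above.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class EventCollector {
        public static void main(String[] args) {
            // Connection and serialization settings; the broker address is a placeholder.
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (Producer<String, String> producer = new KafkaProducer<>(props)) {
                // Each incoming event (e.g. a tracking beacon or metric sample) is appended
                // to a topic; consumers (Hadoop loaders, Storm topologies, alerting services)
                // read the topic independently and at their own pace.
                producer.send(new ProducerRecord<>("tracking-events", "device-123",
                        "{\"event\":\"app_open\",\"ts\":1354060800}"));
            }
        }
    }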