Storm-kafka-mongoDB integration



I am continuously reading 500 MB of random tuples from a Kafka producer, and in a Storm topology I insert them into MongoDB using the Mongo Java driver. The problem is that throughput is very low, around 4-5 tuples per second.



Without the DB insert, a simple print statement gives 684 tuples per second. I plan to run 1 million records from Kafka and measure the throughput with the Mongo insert included.



I tried tuning with the setMaxSpoutPending and setMessageTimeoutSecs parameters in the Kafka config.



final SpoutConfig kafkaConf = new SpoutConfig(zkrHosts, kafkaTopic, zkRoot, clientId);
kafkaConf.ignoreZkOffsets = false;
kafkaConf.useStartOffsetTimeIfOffsetOutOfRange = true;
kafkaConf.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
kafkaConf.stateUpdateIntervalMs = 2000;
kafkaConf.scheme = new SchemeAsMultiScheme(new StringScheme());

final TopologyBuilder topologyBuilder = new TopologyBuilder();
topologyBuilder.setSpout("kafka-spout", new KafkaSpout(kafkaConf), 1);
topologyBuilder.setBolt("print-messages", new MyKafkaBolt()).shuffleGrouping("kafka-spout");

Config conf = new Config();
conf.setDebug(true);
conf.setMaxSpoutPending(1000);
conf.setMessageTimeoutSecs(30);


Execute method of the bolt:



JSONObject jObj = new JSONObject();
jObj.put("key", input.getString(0));

if (jObj.size() > 0) {
    final DBCollection quoteCollection = dbConnect.getConnection().getCollection("stormPoc");
    if (quoteCollection != null) {
        BasicDBObject dbObject = new BasicDBObject();
        dbObject.putAll(jObj);
        quoteCollection.insert(dbObject);
        // logger.info("inserted in Collection !!!");
    } else {
        logger.info("Error while inserting data in DB!!!");
    }
}
collector.ack(input);









      mongodb apache-kafka performance-testing apache-storm






edited yesterday

asked yesterday by PPB
          1 Answer
          There is a storm-mongodb module for integration with Mongo. Does it not do the job? https://github.com/apache/storm/tree/b07413670fa62fec077c92cb78fc711c3bda820c/external/storm-mongodb
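As a rough illustration of what wiring in that module could look like, here is a minimal sketch. The connection URI is a placeholder, and the mapped field name "key" is assumed to match the tuple field emitted in the question's bolt; check the storm-mongodb README for the exact API of your Storm version.

```java
import org.apache.storm.mongodb.bolt.MongoInsertBolt;
import org.apache.storm.mongodb.common.mapper.MongoMapper;
import org.apache.storm.mongodb.common.mapper.SimpleMongoMapper;
import org.apache.storm.topology.TopologyBuilder;

public class MongoTopologySketch {
    public static void main(String[] args) {
        String url = "mongodb://127.0.0.1:27017/test"; // placeholder URI
        String collectionName = "stormPoc";            // collection from the question

        // Map the tuple field "key" to a document field of the same name.
        MongoMapper mapper = new SimpleMongoMapper().withFields("key");
        MongoInsertBolt insertBolt = new MongoInsertBolt(url, collectionName, mapper);

        TopologyBuilder builder = new TopologyBuilder();
        // spout set up as in the question, e.g. builder.setSpout("kafka-spout", ...);
        builder.setBolt("mongo-insert", insertBolt).shuffleGrouping("kafka-spout");
    }
}
```

This replaces the hand-written insert logic in the bolt's execute method; the module also handles acking for you.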



You shouldn't use storm-kafka for Kafka integration; it is deprecated. Use storm-kafka-client instead.
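A sketch of the equivalent spout setup with storm-kafka-client, assuming a recent 1.x/2.x API; the bootstrap server address and consumer group id are placeholders:

```java
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class KafkaClientSpoutSketch {
    public static void main(String[] args) {
        // Talks to the Kafka brokers directly, no ZooKeeper hosts needed.
        KafkaSpoutConfig<String, String> spoutConf =
            KafkaSpoutConfig.builder("localhost:9092", "kafkaTopic") // placeholders
                .setProp("group.id", "storm-poc")
                .build();

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConf), 1);
    }
}
```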



          Setting conf.setDebug(true) will impact your processing, as Storm will log a fairly huge amount of text per tuple.
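For throughput measurements, the fix is a one-line change to the Config from the question:

```java
import org.apache.storm.Config; // package name may differ on older Storm versions

public class ConfigSketch {
    public static void main(String[] args) {
        Config conf = new Config();
        conf.setDebug(false);          // avoid per-tuple debug logging
        conf.setMaxSpoutPending(1000);
        conf.setMessageTimeoutSecs(30);
    }
}
```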






answered yesterday by Stig Rohde Døssing




























