04 Aug 2017, 10:10

MongoDB POJO Support

I’m really pleased to announce that version 3.5.0 of the MongoDB Java Driver has been released with POJO (Plain Old Java Object) support!

Codecs

MongoDB uses BSON, a binary superset of JSON, for its wire protocol and storage format. The 3.0 series of the MongoDB Java Driver introduced Codecs - an improved way of translating BSON into native Java objects such as Document or BsonDocument.

Codecs are an abstraction that determines how BSON data is converted and into what type. However, writing your own custom POJO Codecs can be quite verbose, as each POJO requires a Codec implementation to be registered in the CodecRegistry. The amount of code required to support an application with tens of POJOs was often seen as a barrier to entry.

However, the benefits of using Codecs for handling your POJOs are numerous. They can easily simplify your main application code, as POJOs can map directly to the domain, making the code easier to reason about. Another benefit is speed: Codecs can negate the need for an intermediate map-like object before hydrating your domain object. For these reasons, automatic creation of Codecs from POJOs has been a long-requested feature.

PojoCodecProvider
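The excerpt ends here, but a minimal sketch shows the shape of class the new PojoCodecProvider works with - a no-arg constructor plus conventional getter/setter pairs. The Person class below is illustrative, not taken from the post:

```java
// A sketch of a POJO that the 3.5 driver's PojoCodecProvider can map
// automatically. Person and its fields are purely illustrative.
class Person {
    private String name;
    private int age;

    public Person() {}  // no-arg constructor, used when decoding BSON

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

Registering such classes is then a matter of building a provider with `PojoCodecProvider.builder().automatic(true).build()` and combining it into a `CodecRegistry` - see the driver documentation for the exact incantation.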

31 Mar 2017, 13:15

MongoDB Scala Driver 2.0 released

The 2.0.0 version of the official Scala Driver for MongoDB has been released!

Case Class support

I’m really happy to announce the introduction of case class support, making it much easier to use your domain models with MongoDB. Internally, Codecs are used to convert datatypes to and from BSON - the internal data format for MongoDB. The 2.0 release includes a Macro that can create codecs from case classes.
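As a rough sketch of how that looks in practice (the Person case class and collection name are illustrative; the imports follow the 2.0 driver's documentation):

```scala
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}

case class Person(firstName: String, lastName: String)

// The Macro derives a Codec for Person at compile time
val codecRegistry = fromRegistries(fromProviders(classOf[Person]), DEFAULT_CODEC_REGISTRY)

// A collection can then work with Person directly:
// val people = database.withCodecRegistry(codecRegistry).getCollection[Person]("people")
```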

06 Sep 2016, 17:10

MongoDB Spark Connector releases!

Version 1.1.0 of the MongoDB Spark Connector has been released, along with the MongoDB Spark Connector 2.0.0-rc0, which brings Spark 2.0 support.

1.1.0

This is the first release after the 1.0.0 connector and contains some API improvements and updates based on feedback from users. Many thanks to all those who have provided feedback, whether through the MongoDB User mailing list, StackOverflow, or the Spark Jira project.

It’s been thrilling to get such great feedback and find out about some of the real-world scenarios the connector has been used for. One of my favourites so far has been hearing how China Eastern Airlines uses the connector to save time and money. But whether you’re a big or small user of the connector, I’d really appreciate your feedback and comments. It really is central to making this connector even better and more accessible.

Improvements in 1.1.0

27 Jun 2016, 22:10

MongoDB Spark Connector Released

The new MongoDB Spark connector has been released!

Last month I announced that the new Spark connector for MongoDB was in beta. After some invaluable testing by the community, I’m excited to announce that the first official release is now available from spark-packages:

> $SPARK_HOME/bin/spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10:1.0.0

A clean, simple connector.

At MongoDB we’ve been listening to your feedback about what you would like from a new MongoDB connector. With that in mind, we’ve written a totally new, idiomatic connector for Spark:

18 May 2016, 13:43

Introducing a new MongoDB Spark Connector

Update!

The MongoDB Spark connector has been released! See the official documentation for more information on getting started!


Following on from the official announcement yesterday, I’m really excited to write a few words about the new MongoDB Spark Connector.

Getting started

Before I go into detail about the hows and whys, first have a look at a quick usage example:
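The excerpt cuts off before the example itself; with the connector's 1.x API, a quick usage example looks roughly like this (it assumes a spark-shell session with the connector package loaded and spark.mongodb.input.uri configured - the collection contents are illustrative):

```scala
import com.mongodb.spark._

// Load the configured collection as an RDD of Documents
val rdd = MongoSpark.load(sc)

println(rdd.count())         // number of documents in the collection
println(rdd.first().toJson)  // first document, rendered as JSON
```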

20 Nov 2015, 12:06

El Capitan and key_load_public: invalid format
I upgraded to El Capitan on my Mac and it all went smoothly apart from one hitch. When pulling from or pushing to GitHub I would get the following error: key_load_public: invalid format. Everything worked, but opaque error messages are annoying. I tried googling and all I found was this article. It seemed to solve the issue, but I didn’t like the solution, mainly because it didn’t help me understand the problem!

20 Oct 2015, 13:10

MongoDB Scala Driver Released

The new Scala Driver for MongoDB has been released! Last month I announced the first release candidate of a new idiomatic Scala Driver for MongoDB and I’m excited to announce that the first official release is now available on Sonatype for Scala 2.11:

> "org.mongodb.scala" %% "mongo-scala-driver" % "1.0.0"

A clean, simple Scala driver.

At MongoDB we’ve been listening to your feedback about what you would like from a new Scala driver.

23 Sep 2015, 13:00

Introducing a new MongoDB Scala Driver

Update - now released!

I’m really pleased to announce the first release candidate of a new MongoDB Scala Driver!

Insider information

At MongoDB we’ve been really busy: back in April we introduced the 3.0 Java driver. It was a massive undertaking that included numerous improvements and updates. What got me most excited about the 3.0 release was the introduction of a new fully asynchronous, non-blocking driver. Using this asynchronous driver as a base, we also released an RxJava driver and a Reactive Streams driver.

23 Apr 2014, 00:00

How to: Handle multiple Scala versions

I recently upgraded Casbah to support the latest Scala 2.11 release, and for the first time when supporting multiple Scala versions I hit a stumbling block. If you’re writing a library that wants to support multiple versions of Scala in a single code base, it’s generally easy, isn’t it? Thankfully, it is, as sbt can do the heavy lifting for you.

Three steps to success

The sbt documentation covers the basics nicely in its cross-building documentation. But what’s the path to success?
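The excerpt stops here, but the gist of a cross-building setup fits in a small build.sbt fragment (the version numbers are illustrative):

```scala
// build.sbt -- cross-building one code base against several Scala versions
scalaVersion       := "2.11.0"                 // default version
crossScalaVersions := Seq("2.10.4", "2.11.0")  // versions to build against

// Version-specific sources can live in src/main/scala-2.10 and
// src/main/scala-2.11; sbt picks up the matching directory
// automatically for each cross-build.
```

Prefixing a task with + (e.g. `+test` or `+publish`) then runs it once for every version listed in crossScalaVersions.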

06 Nov 2013, 00:00

RxJava - understandably reactive

Reactive programming is hot stuff at the moment, and the Coursera Principles of Reactive Programming course has just started (it’s not too late to enroll).

Recently, I’ve been hearing good things about RxJava (a port of .NET’s Reactive Extensions), so I wanted to learn more. Then I stumbled upon a video from a recent SF Scala meetup covering what it is, how they implemented the core, and how support was then added for other JVM languages.