Special Coverage: Port of LA, GE pilot aims to take shipping data from analog to digital

The Port of Los Angeles and General Electric recently launched an ambitious data portal pilot program that will digitize the movement of shipment information in an effort to improve the flow of goods at the nation’s busiest seaport.


In shipping, success boils down to two things: the physical movement of goods and the flow of information needed to move those goods. How well both are conveyed may mean the difference between products arriving at shelves in time for the holiday season and being stranded for weeks in a container yard.
That’s why the industry is closely watching the Port of Los Angeles’ ambitious plan to move that information from analog to digital.
Teaming up with General Electric, officials at the Port of Los Angeles recently launched a data portal pilot program that will digitize the movement of shipping data in an effort to improve the flow of goods at the nation’s busiest seaport.
“It was really about a concept to begin with,” said Port of Los Angeles Executive Director Gene Seroka, who spoke to American Shipper just before the official launch of the pilot program. “What could we do to harness the data elements that exist throughout the supply chain universe but tend to be very siloed? And could the Port of L.A. be that conduit of information? Could we be the centerpiece that allows for sharing information?”

Thinking Big. The world’s two largest container carriers, Maersk Line and Mediterranean Shipping Co. (MSC), will participate in the pilot data project, along with the Port of L.A.’s flagship terminal, APMT at Pier 400, which is operated by Maersk subsidiary APM Terminals. The Home Depot, Lowe’s and other major retailers are also taking part in the project. The port will chart the performance levels of the 2M Alliance’s new TransPacific 6 service based on the information provided by the data portal.
“It’s information of that nature that would allow us really to track this the right way, but more importantly, it allowed us to have visibility to this information much earlier than the industry was seeing it today,” Seroka said. “Information that typically was available 24 to 36 hours before vessel arrival may now be available up to 14 days before the ship comes to Los Angeles.”
The pilot program has been more than two years in the making, picking up steam when the CMA CGM Benjamin Franklin, which at 18,000 TEUs was the largest container ship ever to call at a Western Hemisphere port, came to Los Angeles in December 2015.
Port officials were on the phone every night with the origin port, the liner shipping company and the terminal operator, sharing Excel spreadsheets on loading and schematics, trying to find the best way to manage the discharge and loading of cargo from such a big ship.
The port, already operating on a compressed port stay of 56 hours, was able to move that ship out 13 hours ahead of schedule with help from labor, the terminal operator and CMA CGM.
“That really gave us the impetus to say, ‘Now, if we can digitize the information and move that information much faster throughout the supply chain, maybe we could see gains like we saw off the Ben Franklin,’” Seroka said.
Shortly thereafter, port officials began talking to every stakeholder along the supply chain and went through a number of user discovery sessions and focus groups. Former Secretary of Commerce Penny Pritzker even held roundtables with various stakeholders in the supply chain to discuss what they needed and what could be done to make the global supply chain and U.S. ports more efficient and more competitive.
“That was a real big part of the mix, to get firsthand advice from the people who utilize our port and who are prominent within the supply chain,” Seroka said.
The port, which invested about $1.3 million toward the pilot program, put out a request for proposals and found GE to be a great fit.
“They had a lot of the visionary aspects that we were looking for in this project, and we hit it off right away following a very detailed selection process,” Seroka said.

Shifting Focus. GE Transportation, which has traditionally focused mainly on the rail sector, has been looking to expand its digital reach to other parts of the supply chain, including beneficial cargo owners (BCOs).
“When we saw the bid, we thought, ‘What a great place to expand into, especially with the Port of Los Angeles being the largest port in North America,’” said Jennifer Schopfer, executive director of customer performance analytics at GE Transportation. “And we thought that we could apply some of our supply chain expertise and knowledge, and also some of the digital solutions that we have in the rail industry. Some of the tools we have in yard planning, in intermodal yard planning, we can really apply those same kinds of concepts and learnings to the port environment.”

One of the biggest challenges in creating the program was handling the sheer amount of data across a wide pool of stakeholders and bringing it together in a timely manner, Schopfer said. So GE has applied its multibillion-dollar industrial internet platform, Predix, a cloud-based operating system that provides strong security along with rapid data ingestion and analytics capabilities, to help lessen the load.
“We’re tapping into over 200 databases and bringing together more than 30,000 data attributes just for this pilot,” she said. “That’s a lot of data. That’s really where our platform comes into play. We have automated ingestion channels and things like that that have helped us, and we have a lot of experience with data curation that helps speed that up.”
The program, which started with a soft launch April 17 and officially launched a month later, includes non-proprietary, generic data such as the origin and destination of the cargo, which shipping line and terminal operator are involved, the type and number of containers being shipped, whether the cargo will travel by rail, and where it will stop before reaching its final destination.
The system also has been designed with tight security in mind to ensure that competitive information does not leave the stakeholder groups taking part in the program, Seroka said. Channel access controls allow beneficial cargo owners who work with other companies to move their cargo to delegate access rights, sharing pertinent information from the system with their railroads, trucking companies and others that have been granted permission.
That would let railroads, trucking operators and chassis providers plan where their assets need to be located and what personnel will be needed on the job, as well as how best to prepare for the incoming cargo volume as seamlessly as possible.
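The shipment data fields and channel-based permission model described above can be illustrated with a small, hypothetical sketch. The field names, the grant mechanism and the partner names (e.g. "BNSF Railway") are illustrative assumptions, not the portal's actual schema or API:

```python
# Hypothetical sketch of a non-proprietary shipment record and channel-based
# access grants, modeled on the article's description. Field names and the
# grant model are illustrative assumptions, not the portal's actual design.
from dataclasses import dataclass, field


@dataclass
class ShipmentRecord:
    """Generic, non-proprietary shipment attributes shared via the portal."""
    origin: str
    destination: str
    shipping_line: str
    terminal: str
    container_type: str
    container_count: int
    moves_by_rail: bool
    intermediate_stop: str


@dataclass
class AccessChannel:
    """A beneficial cargo owner (BCO) delegates read access per partner."""
    owner: str
    granted: set = field(default_factory=set)

    def grant(self, partner: str) -> None:
        # The BCO explicitly extends access to one of its service providers.
        self.granted.add(partner)

    def can_read(self, partner: str) -> bool:
        # Only the owner and explicitly granted partners see the data.
        return partner == self.owner or partner in self.granted


record = ShipmentRecord(
    origin="Shanghai", destination="Los Angeles",
    shipping_line="Maersk Line", terminal="APMT Pier 400",
    container_type="40' dry", container_count=120,
    moves_by_rail=True, intermediate_stop="rail yard",
)

channel = AccessChannel(owner="The Home Depot")
channel.grant("BNSF Railway")  # BCO shares visibility with its railroad

print(channel.can_read("BNSF Railway"))     # True: permission granted
print(channel.can_read("Competitor Inc."))  # False: outside the channel
```

The point of the sketch is the separation of concerns Seroka describes: the record itself carries only generic attributes, while the channel object controls who may read it, so competitive information stays within each stakeholder group.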
“We decided that we would take a forward-leaning perspective and really roll the dice on this kind of new technology and this partnership,” he said. “So I think that really has to do with people that are visionary and willing to take a chance to improve what happens.”

Real Urgency. All this raises the question: Why has it taken so long for the industry to move from analog to digital?
Adam Compain, founder and CEO of ClearMetal, a San Francisco-based tech company that specializes in predictive logistics for global trade, says necessity is the reason.
“The way the industry has solved its problems for the first 20 or 30 years has been building bigger operational scale,” said Compain, who founded the company in 2014. “As globalization was growing, the way to drive efficiency was to build larger port infrastructure, allow for larger ships to dock at its port, and build scale to increase throughput.”
There was no real urgency to do things differently, until growth slowed.
“Now that there’s so much economic pressure to drive efficiency, there’s a real force to be smarter, not larger,” he said. “How do you offload so many containers on and off a ship? How do we get smarter through software to be more efficient?”
Meanwhile, initial performance results from the Port of Los Angeles’ pilot program are expected in July. GE and port officials are confident the pilot will be successful and are already looking at ways to expand this technology, Seroka said.
“No other port in the nation has attempted to do this,” he said. “And I would attest with my time working overseas in my career, that no other port in the world is taking the approach that we are right now.”
According to Seroka, the pilot program has the potential to be transformational to the industry.
“As we look across technology and how big data continues to play such a role in the advancement of industry, this is going to be another key example,” he said.
Seroka said he’s heard stories of truck dispatchers who have to collect 20 or 30 emails daily just to understand what terminal operations will be like that week, or chassis providers and railroad operators that have to scramble to make sure there’s enough inventory and manpower to meet the demand of larger vessel calls when that information doesn’t come as quickly as they would like.
“It will finally harness all the data points that exist today,” Seroka said. “And if we can get ahead of this and make sure that people have the data that they need to analyze how best to run their business, it will transform the industry.”

Karen Robes Meeks is an independent maritime and logistics journalist. She can be reached by email at [email protected]