IT Software Operations Unit

Japan


Hayashi T.,System IP Core Research Laboratories | Ueno H.,IT Software Operations Unit
NEC Technical Journal | Year: 2010

As communication uses broader bandwidths and communication needs become diversified, a wide variety of equipment is being introduced into network infrastructures. In the cloud computing environment, it is essential to support services that arise frequently and requirements that keep changing. Dynamic reconfiguration technology realizes network nodes with excellent performance scalability and functional scalability to support the cloud computing environment. This technology is also applicable to middlebox virtualization, enabling performance scale-out and system optimization.


Kato K.,IT Software Operations Unit | Nishimura T.,IT Software Operations Unit | Katsumi J.,NEC Soft Ltd.
NEC Technical Journal | Year: 2010

Optimization of IT costs and the "cloud computing service," which is expected to respond quickly and flexibly to changes in future business procedures, are currently attracting attention. This paper introduces the requirements for the data center operations management methods that form the foundation for enterprise use of cloud services. It also introduces a method of data center implementation using "MasterScope," the integrated operations management software provided by NEC. This method is particularly helpful in providing a performance analysis technique that is key to supporting the stable provision of services by minimizing system downtime. The paper also introduces the latest technologies for solving issues caused by "silent faults," which are difficult to solve with traditional technologies.


Oosawa H.,IT Software Operations Unit | Miyata T.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

The era of big data is characterized by an increasing need to create new value and new business through the real-time processing of large amounts of data. Such processing requires an increase in individual data processing speeds as well as high throughput. This paper introduces the InfoFrame Table Access Method, an in-memory database product suited to real-time big data processing thanks to its high-speed parallel data processing capability.
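
The abstract does not describe the product's API, so the following Python sketch only illustrates the general idea of partitioned, parallel aggregation over an in-memory table; every name in it is hypothetical and unrelated to the InfoFrame Table Access Method:

# Hypothetical sketch of parallel aggregation over an in-memory table.
# It illustrates partitioned, parallel processing in general and does not
# use the InfoFrame Table Access Method API.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(rows):
    # Aggregate one partition: sum the 'amount' column.
    return sum(row["amount"] for row in rows)

def parallel_sum(table, workers=4):
    # Split the in-memory table into chunks and aggregate them in parallel.
    chunk = max(1, len(table) // workers)
    parts = [table[i:i + chunk] for i in range(0, len(table), chunk)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, parts))

if __name__ == "__main__":
    table = [{"id": i, "amount": i % 100} for i in range(1_000_000)]
    print(parallel_sum(table))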


Kawabata T.,IT Software Operations Unit | Hamada M.,IT Software Operations Unit | Tamura M.,IT Software Operations Unit | Hakuba T.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

The age of the information explosion has recently been creating various opportunities for big data analysis. At the same time, progress in hardware development has brought a significant increase in mountable memory capacities. The InfoFrame DataBooster meets the need for high-speed processing of big data by using an in-memory data processing technology based on a column store. This paper introduces the differences between a regular RDB and the InfoFrame DataBooster. It goes on to discuss the background and development method of the SQL interface in the latest version, how the InfoFrame DataBooster is used, and its application domains (case studies).
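
As a minimal, purely illustrative sketch of the column-store idea the abstract refers to (not the DataBooster internals), the contrast below shows why keeping each column as its own contiguous array favors scan and aggregation workloads:

# Illustrative contrast between row-store and column-store layouts.
# This is not DataBooster code; the data and names are made up.

# Row store: one record per row, all columns kept together.
rows = [
    {"id": 1, "region": "east", "sales": 120},
    {"id": 2, "region": "west", "sales": 80},
    {"id": 3, "region": "east", "sales": 200},
]

# Column store: each column held as its own contiguous array.
columns = {
    "id":     [1, 2, 3],
    "region": ["east", "west", "east"],
    "sales":  [120, 80, 200],
}

# Aggregating 'sales' in the row store touches every field of every record.
row_total = sum(r["sales"] for r in rows)

# In the column store the same query scans one dense array, which is
# cache-friendly and easy to compress or process in parallel.
col_total = sum(columns["sales"])

assert row_total == col_total == 400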


Kawanabe M.,IT Software Operations Unit | Yoshimura S.,IT Software Operations Unit | Utaka J.,IT Software Operations Unit | Yoshioka H.,IT Software Operations Unit | And 2 more authors.
NEC Technical Journal | Year: 2012

With the rapid increase in corporate data that must be stored long-term, such as backups of management information and archives of e-mails with customers, the need for safe, easy storage of big data is higher than ever. HYDRAstor is a grid storage system that meets these needs. Adopting a revolutionary grid architecture to achieve high performance, scalability and reliability as well as labor savings in operation and management, it is well suited to the storage of big data. This paper outlines the technologies used in HYDRAstor, describes their features, and introduces actual cases in which it is used.
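
The abstract stays at the architecture level; the toy sketch below (generic hash-based placement, not the HYDRAstor algorithm) only illustrates how a grid of storage nodes can scale capacity out by spreading data blocks across nodes:

# Toy illustration of spreading data blocks across grid storage nodes by
# content hash so that adding nodes scales capacity out.  This is a generic
# sketch, not the HYDRAstor placement or replication scheme.
import hashlib

class GridPlacement:
    def __init__(self, nodes):
        self.nodes = list(nodes)

    def node_for(self, block: bytes) -> str:
        # Pick the node that stores this block, based on its content hash.
        digest = hashlib.sha256(block).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

placement = GridPlacement(["node-1", "node-2", "node-3", "node-4"])
print(placement.node_for(b"backup-segment-0001"))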


Kato K.,IT Software Operations Unit | Yabuki K.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

NEC's MasterScope middleware is an integrated operations management software suite. MasterScope collects operational and performance metrics from target IT systems and analyzes them comprehensively in order to detect and locate system failures. Detected events and failures are reported to the operator, and workarounds can be applied to recover from them. There are similarities between such an analysis process for operations management and the processing required for big data. For instance, the system performance analysis software "MasterScope Invariant Analyzer" automatically discovers important correlations in large amounts of performance data and proactively detects hidden performance anomalies, thereby avoiding serious system-level damage. This paper describes the analysis technology of MasterScope, which is similar to big data analysis technology, and then introduces experimental applications of the system invariant analysis technology in domains other than operations management.
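
The paper's algorithm is not reproduced here; the sketch below only conveys the general invariant idea in simplified form, learning a linear relation between two metrics from a normal period and flagging samples that break it. It is not the MasterScope Invariant Analyzer implementation:

# Simplified illustration of invariant-style analysis, not the MasterScope
# algorithm: fit y ~ a*x + b on normal data, flag samples that deviate.
import statistics

def fit_invariant(xs, ys):
    # Least-squares fit of y = a*x + b over the training window.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def broken(a, b, x, y, tolerance):
    # True if the observed pair deviates from the learned relation.
    return abs(y - (a * x + b)) > tolerance

# Training data: request rate vs. CPU usage during normal operation.
req = [100, 200, 300, 400, 500]
cpu = [12, 22, 31, 43, 52]
a, b = fit_invariant(req, cpu)

# Unusually high CPU for the observed request rate breaks the invariant.
print(broken(a, b, 250, 70, tolerance=10))   # True  -> possible hidden anomaly
print(broken(a, b, 250, 27, tolerance=10))   # False -> relation still holds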


Sukenari T.,IT Software Operations Unit | Tamura M.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

Existing relational databases experience problems when used with big data because they cannot deal flexibly with increases in the amount of data and the number of accesses (scaling out). Key-value stores, on the other hand, are an advanced technology, but they lack SQL-based data access and the transaction processing required for mission-critical operations. The InfoFrame Relational Store (IERS) is scale-out-capable database software optimized for big data utilization, equipped with 1) an SQL interface, 2) transaction processing and 3) the high reliability that makes it applicable to mission-critical operations. This paper introduces the features of IERS and its architecture.
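
IERS's internal design is not described in the abstract; as a rough, hypothetical illustration of the scale-out property it contrasts with conventional RDBs, the sketch below spreads rows over shards by key so that capacity grows with the number of shards. It omits the SQL layer, transaction processing and reliability mechanisms the abstract lists as IERS features:

# Toy sharded key-value layer, illustrating scale-out by key hashing only.
# This is not the IERS architecture or API.
import hashlib

class ShardedStore:
    def __init__(self, shard_count=4):
        self.shards = [dict() for _ in range(shard_count)]

    def _shard(self, key):
        # Route each key to a shard by hashing it.
        h = int(hashlib.md5(key.encode()).hexdigest(), 16)
        return self.shards[h % len(self.shards)]

    def put(self, key, value):
        self._shard(key)[key] = value

    def get(self, key):
        return self._shard(key).get(key)

store = ShardedStore()
store.put("user:42", {"name": "Sato", "plan": "enterprise"})
print(store.get("user:42"))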


Muroi Y.,IT Software Operations Unit | Mukai Y.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

The information explosion is becoming a real issue: the amount of information stored on file servers is continuously growing, making it difficult to identify, organize and utilize that information. The latest version, V2.1, of the Information Assessment Tool, a tool for the "visualization," "slimming," "activation" and "optimization" of file servers, adopts the InfoFrame DataBooster high-speed data processing engine to handle large-scale file servers and enable interactive analysis based on high-speed search and aggregation.


Katou M.,IT Software Operations Unit
NEC Technical Journal | Year: 2010

The trend toward reduced IT investment due to the current economic climate has focused attention on cloud computing because it does not require initial capital investment. Nevertheless, even in the cloud environment, the needs of business system infrastructures are essentially the same. WebOTX has an operational history of more than a decade as a service execution platform that can run business systems effectively. This paper describes its main features of high reliability and operability, together with the function enhancements featured in the latest version.


Kudo M.,IT Software Operations Unit
NEC Technical Journal | Year: 2012

Big data multiplies and changes every day, and the quantity of data and the required computing/processing capability vary between projects. To process big data efficiently, it is necessary to allocate the required ICT resources dynamically, scalably and in optimal placement. This paper describes the features and effectiveness of ProgrammableFlow, which optimizes computer and network resources dynamically using OpenFlow/Software-Defined Networking (SDN) technology.
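
The abstract stays at the conceptual level; the toy flow table below illustrates the OpenFlow-style match-action model that such SDN controllers program. It does not use the ProgrammableFlow API, and all values are made up:

# Toy match-action flow table in the OpenFlow style: a controller installs
# rules, and each packet is handled by the first matching rule.  This is a
# generic illustration of the SDN model, not ProgrammableFlow itself.
import ipaddress

flow_table = [
    # (match fields, action)
    ({"dst_ip": "10.0.1.0/24"}, "forward:port-1"),
    ({"dst_ip": "10.0.2.0/24"}, "forward:port-2"),
    ({}, "send-to-controller"),          # table-miss rule
]

def action_for(packet):
    # Return the action of the first rule whose match fields fit the packet.
    for match, action in flow_table:
        cidr = match.get("dst_ip")
        if cidr is None or ipaddress.ip_address(packet["dst_ip"]) in ipaddress.ip_network(cidr):
            return action
    return "drop"

print(action_for({"dst_ip": "10.0.2.7"}))   # forward:port-2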
