Tip of the Day
Language: Java
Expertise: Beginner
Jun 6, 2018

Relying on the Default TimeZone

Calendar calendar = new GregorianCalendar(); // implicitly uses the default time zone
calendar.setTime(date);
calendar.set(Calendar.HOUR_OF_DAY, 0);
calendar.set(Calendar.MINUTE, 0);
calendar.set(Calendar.SECOND, 0);
// the MILLISECOND field is never cleared
Date startOfDay = calendar.getTime();

The code above calculates the start of the day (0h00). The first mistake is that the Calendar's millisecond field is never cleared, but the bigger mistake is not setting the TimeZone of the Calendar object; as a result, the Calendar uses the default time zone. In a desktop application this might be fine, but in server-side code it is a serious problem: 0h00 in Tokyo is a very different instant from 0h00 in New York, so the developer should determine which time zone is relevant for this computation. The corrected version sets both explicitly:

Calendar calendar = new GregorianCalendar(user.getTimeZone()); // explicit time zone
calendar.setTime(date);
calendar.set(Calendar.HOUR_OF_DAY, 0);
calendar.set(Calendar.MINUTE, 0);
calendar.set(Calendar.SECOND, 0);
calendar.set(Calendar.MILLISECOND, 0); // clear the previously forgotten field
Date startOfDay = calendar.getTime();
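
To see how much the time zone matters, here is a minimal, self-contained sketch. The startOfDay helper and the two example zone IDs are illustrative, and the article's user.getTimeZone() is assumed to return a java.util.TimeZone:

import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class StartOfDayDemo {

    // Start of day for the given date, computed in the given time zone.
    static Date startOfDay(Date date, TimeZone zone) {
        Calendar calendar = new GregorianCalendar(zone);
        calendar.setTime(date);
        calendar.set(Calendar.HOUR_OF_DAY, 0);
        calendar.set(Calendar.MINUTE, 0);
        calendar.set(Calendar.SECOND, 0);
        calendar.set(Calendar.MILLISECOND, 0);
        return calendar.getTime();
    }

    public static void main(String[] args) {
        Date now = new Date();
        Date tokyoMidnight = startOfDay(now, TimeZone.getTimeZone("Asia/Tokyo"));
        Date newYorkMidnight = startOfDay(now, TimeZone.getTimeZone("America/New_York"));

        // The two instants differ by the gap between the zones' UTC offsets,
        // and may even fall on different calendar dates, since "today" in
        // Tokyo can already be "tomorrow" relative to New York.
        System.out.println("Tokyo:    " + tokyoMidnight);
        System.out.println("New York: " + newYorkMidnight);
        System.out.println("Gap in hours: "
                + (newYorkMidnight.getTime() - tokyoMidnight.getTime()) / 3_600_000L);
    }
}

On Java 8 and later, java.time expresses the same computation more directly, for example ZonedDateTime.ofInstant(date.toInstant(), zoneId).toLocalDate().atStartOfDay(zoneId), which avoids the easy-to-forget field resets entirely.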
Octavia Anghel
 