Commit

Add PMD and prohibit unnecessary fully qualified class names in code (apache#4350)

* Add PMD and prohibit unnecessary fully qualified class names in code

* Extra fixes

* Remove extra unnecessary fully-qualified names

* Remove qualifiers

* Remove qualifier
leventov authored and jihoonson committed Jul 17, 2017
1 parent b720351 commit 60cdf94
Showing 74 changed files with 182 additions and 125 deletions.
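The change applied across these files follows one mechanical pattern: add an import once, then refer to the class by its simple name at every use site, in code and in Javadoc {@link} tags alike. A standalone sketch of the pattern (the class and method names here are hypothetical, not taken from the Druid sources):

```java
// Before this style of cleanup, every reference would have spelled out
// java.util.Properties in full; the single import lets all of them shrink.
import java.util.Properties;

public class FqcnExample
{
  private final Properties defaults = new Properties();

  /**
   * Returns the configured value for {@code key}, or {@code fallback}.
   * Once the class is imported, Javadoc references such as
   * {@link Properties} can use the simple name as well.
   */
  public String valueOr(String key, String fallback)
  {
    return defaults.getProperty(key, fallback);
  }

  public static void main(String[] args)
  {
    FqcnExample example = new FqcnExample();
    System.out.println(example.valueOr("host", "localhost")); // prints "localhost"
  }
}
```

PMD's UnnecessaryFullyQualifiedName rule flags exactly the "before" form, which is what this commit wires in alongside the IntelliJ inspection profile.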
6 changes: 6 additions & 0 deletions .idea/inspectionProfiles/Druid.xml

Some generated files are not rendered by default.

3 changes: 3 additions & 0 deletions .idea/scopes/NonGeneratedFiles.xml


2 changes: 1 addition & 1 deletion api/src/main/java/io/druid/guice/JsonConfigProvider.java
Original file line number Diff line number Diff line change
@@ -48,7 +48,7 @@
* <h3>Implementation</h3>
* <br/>
* The state of {@code <T>} is defined by the value of the property {@code propertyBase}.
* This value is a json structure, decoded via {@link JsonConfigurator#configurate(java.util.Properties, String, Class)}.
* This value is a json structure, decoded via {@link JsonConfigurator#configurate(Properties, String, Class)}.
* <br/>
*
* An example might be if DruidServerConfig.class were
18 changes: 9 additions & 9 deletions api/src/main/java/io/druid/guice/LifecycleModule.java
@@ -49,8 +49,8 @@ public class LifecycleModule implements Module
* scope. That is, they are generally eagerly loaded because the loading operation will produce some beneficial
* side-effect even if nothing actually directly depends on the instance.
*
* This mechanism exists to allow the {@link io.druid.java.util.common.lifecycle.Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link io.druid.java.util.common.lifecycle.Lifecycle}. It is also possible to just bind things eagerly with Guice,
* This mechanism exists to allow the {@link Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link Lifecycle}. It is also possible to just bind things eagerly with Guice,
* it is not clear which is actually the best approach. This is more explicit, but eager bindings inside of modules
* is less error-prone.
*
@@ -70,8 +70,8 @@ public static void register(Binder binder, Class<?> clazz)
* scope. That is, they are generally eagerly loaded because the loading operation will produce some beneficial
* side-effect even if nothing actually directly depends on the instance.
*
* This mechanism exists to allow the {@link io.druid.java.util.common.lifecycle.Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link io.druid.java.util.common.lifecycle.Lifecycle}. It is also possible to just bind things eagerly with Guice,
* This mechanism exists to allow the {@link Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link Lifecycle}. It is also possible to just bind things eagerly with Guice,
* it is not clear which is actually the best approach. This is more explicit, but eager bindings inside of modules
* is less error-prone.
*
@@ -92,8 +92,8 @@ public static void register(Binder binder, Class<?> clazz, Annotation annotation
* scope. That is, they are generally eagerly loaded because the loading operation will produce some beneficial
* side-effect even if nothing actually directly depends on the instance.
*
* This mechanism exists to allow the {@link io.druid.java.util.common.lifecycle.Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link io.druid.java.util.common.lifecycle.Lifecycle}. It is also possible to just bind things eagerly with Guice,
* This mechanism exists to allow the {@link Lifecycle} to be the primary entry point from the injector, not to
* auto-register things with the {@link Lifecycle}. It is also possible to just bind things eagerly with Guice,
* it is not clear which is actually the best approach. This is more explicit, but eager bindings inside of modules
* is less error-prone.
*
@@ -107,15 +107,15 @@ public static void register(Binder binder, Class<?> clazz, Class<? extends Annot
}

/**
* Registers a key to instantiate eagerly. {@link com.google.inject.Key}s mentioned here will be pulled out of
* Registers a key to instantiate eagerly. {@link Key}s mentioned here will be pulled out of
* the injector with an injector.getInstance() call when the lifecycle is created.
*
* Eagerly loaded classes will *not* be automatically added to the Lifecycle unless they are bound to the proper
* scope. That is, they are generally eagerly loaded because the loading operation will produce some beneficial
* side-effect even if nothing actually directly depends on the instance.
*
* This mechanism exists to allow the {@link io.druid.java.util.common.lifecycle.Lifecycle} to be the primary entry point
* from the injector, not to auto-register things with the {@link io.druid.java.util.common.lifecycle.Lifecycle}. It is
* This mechanism exists to allow the {@link Lifecycle} to be the primary entry point
* from the injector, not to auto-register things with the {@link Lifecycle}. It is
* also possible to just bind things eagerly with Guice, it is not clear which is actually the best approach.
* This is more explicit, but eager bindings inside of modules is less error-prone.
*
2 changes: 1 addition & 1 deletion api/src/main/java/io/druid/guice/ManageLifecycle.java
@@ -29,7 +29,7 @@
/**
* Marks the object to be managed by {@link io.druid.java.util.common.lifecycle.Lifecycle}
*
* This Scope gets defined by {@link io.druid.guice.LifecycleModule}
* This Scope gets defined by {@link LifecycleModule}
*/
@Target({ ElementType.TYPE, ElementType.METHOD })
@Retention(RetentionPolicy.RUNTIME)
2 changes: 1 addition & 1 deletion api/src/main/java/io/druid/guice/ManageLifecycleLast.java
@@ -29,7 +29,7 @@
/**
* Marks the object to be managed by {@link io.druid.java.util.common.lifecycle.Lifecycle} and set to be on Stage.LAST
*
* This Scope gets defined by {@link io.druid.guice.LifecycleModule}
* This Scope gets defined by {@link LifecycleModule}
*/
@Target({ ElementType.TYPE, ElementType.METHOD })
@Retention(RetentionPolicy.RUNTIME)
2 changes: 1 addition & 1 deletion api/src/main/java/io/druid/guice/PolyBind.java
@@ -69,7 +69,7 @@ public static <T> ScopedBindingBuilder createChoice(
}

/**
* @deprecated use {@link #createChoiceWithDefault(com.google.inject.Binder, String, com.google.inject.Key, String)}
* @deprecated use {@link #createChoiceWithDefault(Binder, String, Key, String)}
* instead. {@code defaultKey} argument is ignored.
*/
@Deprecated
3 changes: 1 addition & 2 deletions api/src/main/java/io/druid/timeline/DataSegmentUtils.java
@@ -67,8 +67,7 @@ public Interval apply(String identifier)
*
* @param dataSource the dataSource corresponding to this identifier
* @param identifier segment identifier
* @return a {@link io.druid.timeline.DataSegmentUtils.SegmentIdentifierParts} object if the identifier could be
* parsed, null otherwise
* @return a {@link DataSegmentUtils.SegmentIdentifierParts} object if the identifier could be parsed, null otherwise
*/
public static SegmentIdentifierParts valueOf(String dataSource, String identifier)
{
Original file line number Diff line number Diff line change
@@ -20,6 +20,7 @@
package io.druid.common.aws;

import com.amazonaws.AmazonClientException;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.google.common.base.Strings;

@@ -32,10 +33,10 @@ public ConfigDrivenAwsCredentialsConfigProvider(AWSCredentialsConfig config) {
}

@Override
public com.amazonaws.auth.AWSCredentials getCredentials()
public AWSCredentials getCredentials()
{
if (!Strings.isNullOrEmpty(config.getAccessKey()) && !Strings.isNullOrEmpty(config.getSecretKey())) {
return new com.amazonaws.auth.AWSCredentials() {
return new AWSCredentials() {
@Override
public String getAWSAccessKeyId() {
return config.getAccessKey();
Original file line number Diff line number Diff line change
@@ -19,6 +19,7 @@

package io.druid.common.aws;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;

public class LazyFileSessionCredentialsProvider implements AWSCredentialsProvider
@@ -42,7 +43,7 @@ private FileSessionCredentialsProvider getUnderlyingProvider() {
}

@Override
public com.amazonaws.auth.AWSCredentials getCredentials()
public AWSCredentials getCredentials()
{
return getUnderlyingProvider().getCredentials();
}
Original file line number Diff line number Diff line change
@@ -53,13 +53,13 @@ public boolean needToSplit(Node node)
*
* Algorithm Split. Divide a set of M+1 index entries into two groups.
*
* S1. [Pick first entry for each group]. Apply Algorithm {@link #pickSeeds(java.util.List)} to choose
* S1. [Pick first entry for each group]. Apply Algorithm {@link #pickSeeds(List)} to choose
* two entries to be the first elements of the groups. Assign each to a group.
*
* S2. [Check if done]. If all entries have been assigned, stop. If one group has so few entries that all the rest
* must be assigned to it in order for it to have the minimum number m, assign them and stop.
*
* S3. [Select entry to assign]. Invoke Algorithm {@link #pickNext(java.util.List, Node[])}
* S3. [Select entry to assign]. Invoke Algorithm {@link #pickNext(List, Node[])}
* to choose the next entry to assign. Add it to the group whose covering rectangle will have to be enlarged least to
* accommodate it. Resolve ties by adding the entry to the group smaller area, then to the one with fewer entries, then
* to either. Repeat from S2.
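The seed-picking step S1 referenced in the Javadoc above (Guttman's quadratic PickSeeds) can be sketched in one dimension, where a rectangle collapses to an interval and area to interval length. This is an illustrative simplification, not the actual Druid implementation:

```java
import java.util.ArrayList;
import java.util.List;

// 1-D sketch of Guttman's PickSeeds: entries are [min, max] intervals.
public class QuadraticSplitSketch
{
  static double area(double[] r)
  {
    return r[1] - r[0];
  }

  static double[] enclose(double[] a, double[] b)
  {
    return new double[]{Math.min(a[0], b[0]), Math.max(a[1], b[1])};
  }

  // S1: pick the pair whose enclosing interval wastes the most space;
  // those two entries become the first elements of the two groups.
  static int[] pickSeeds(List<double[]> entries)
  {
    int[] seeds = {0, 1};
    double worst = Double.NEGATIVE_INFINITY;
    for (int i = 0; i < entries.size(); i++) {
      for (int j = i + 1; j < entries.size(); j++) {
        double waste = area(enclose(entries.get(i), entries.get(j)))
                       - area(entries.get(i)) - area(entries.get(j));
        if (waste > worst) {
          worst = waste;
          seeds = new int[]{i, j};
        }
      }
    }
    return seeds;
  }

  public static void main(String[] args)
  {
    List<double[]> entries = new ArrayList<>();
    entries.add(new double[]{0, 1});
    entries.add(new double[]{0.5, 1.5});
    entries.add(new double[]{10, 11});
    entries.add(new double[]{10.5, 11});
    // The most wasteful pairing takes one entry from each cluster,
    // so the seeds land in different clusters.
    int[] seeds = pickSeeds(entries);
    System.out.println(seeds[0] + " " + seeds[1]); // prints "0 3"
  }
}
```

Steps S2 and S3 then assign the remaining entries one at a time, each to the group whose covering interval grows least.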
Original file line number Diff line number Diff line change
@@ -189,7 +189,7 @@ public void testIntOverflow() throws IllegalAccessException, InstantiationExcept
IntegerSet integerSet = IntegerSet.wrap(wrappedBitmap);
integerSet.add(Integer.MAX_VALUE + 1);
}
catch (java.lang.IllegalArgumentException ex) {
catch (IllegalArgumentException ex) {
e = ex;
}
Assert.assertNotNull(e);
Original file line number Diff line number Diff line change
@@ -21,6 +21,7 @@

import com.google.common.base.Throwables;
import io.druid.java.util.common.ISE;
import org.apache.logging.log4j.core.LifeCycle;
import org.apache.logging.log4j.core.util.Cancellable;
import org.apache.logging.log4j.core.util.ShutdownCallbackRegistry;

@@ -29,7 +30,7 @@
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicBoolean;

public class Log4jShutdown implements ShutdownCallbackRegistry, org.apache.logging.log4j.core.LifeCycle
public class Log4jShutdown implements ShutdownCallbackRegistry, LifeCycle
{
private static final long SHUTDOWN_WAIT_TIMEOUT = 60000;

5 changes: 2 additions & 3 deletions common/src/main/java/io/druid/common/utils/VMUtils.java
@@ -49,9 +49,8 @@ public static long safeGetThreadCpuTime()
*
* @return total CPU time for the current thread in nanoseconds.
*
* @throws java.lang.UnsupportedOperationException if the Java
* virtual machine does not support CPU time measurement for
* the current thread.
* @throws UnsupportedOperationException if the Java virtual machine does not support CPU time measurement for
* the current thread.
*/
public static long getCurrentThreadCpuTime()
{
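The contract documented in the VMUtils Javadoc above can be exercised directly against the JDK's ThreadMXBean, the standard API for per-thread CPU time. The wrapper below is a hypothetical sketch of such a guard, not the VMUtils code itself:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuTimeExample
{
  /**
   * Returns total CPU time for the current thread in nanoseconds.
   *
   * @throws UnsupportedOperationException if the Java virtual machine does
   *         not support CPU time measurement for the current thread.
   */
  public static long currentThreadCpuTime()
  {
    ThreadMXBean bean = ManagementFactory.getThreadMXBean();
    if (!bean.isCurrentThreadCpuTimeSupported()) {
      throw new UnsupportedOperationException("CPU time measurement is not supported");
    }
    // Returns -1 if measurement is supported but currently disabled.
    return bean.getCurrentThreadCpuTime();
  }

  public static void main(String[] args)
  {
    System.out.println(currentThreadCpuTime() >= -1);
  }
}
```

Note that the simple-name form `@throws UnsupportedOperationException` works without an import because `java.lang` is always in scope, which is why the fully qualified `java.lang.UnsupportedOperationException` in the original Javadoc was pure noise.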
Original file line number Diff line number Diff line change
@@ -25,6 +25,7 @@
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
@@ -61,7 +62,7 @@
* @author Alessandro Colantonio
* @version $Id$
*/
public class ConciseSet extends AbstractIntSet implements java.io.Serializable
public class ConciseSet extends AbstractIntSet implements Serializable
{
/**
* generated serial ID
Original file line number Diff line number Diff line change
@@ -21,6 +21,7 @@
import com.google.common.collect.UnmodifiableIterator;
import com.google.common.primitives.Ints;
import io.druid.extendedset.utilities.IntList;
import org.roaringbitmap.IntIterator;

import java.nio.ByteBuffer;
import java.nio.IntBuffer;
@@ -1000,7 +1001,7 @@ public WordIterator newWordIterator()
return new WordIterator();
}

public class WordIterator implements org.roaringbitmap.IntIterator, Cloneable
public class WordIterator implements IntIterator, Cloneable
{
private int startIndex;
private int wordsWalked;
Original file line number Diff line number Diff line change
@@ -30,7 +30,7 @@
import org.apache.hadoop.metrics2.sink.timeline.TimelineMetric;

/**
* Emits all the events instance of {@link com.metamx.emitter.service.ServiceMetricEvent}.
* Emits all the events instance of {@link ServiceMetricEvent}.
* <p>
* All the dimensions will be retained and lexicographically order using dimensions name.
* <p>
Original file line number Diff line number Diff line change
@@ -20,6 +20,7 @@
package io.druid.storage.cassandra;

import com.fasterxml.jackson.core.Version;
import com.fasterxml.jackson.databind.Module;
import com.google.common.collect.ImmutableList;
import com.google.inject.Binder;
import com.google.inject.Key;
@@ -54,10 +55,10 @@ public void configure(Binder binder)
}

@Override
public List<? extends com.fasterxml.jackson.databind.Module> getJacksonModules()
public List<? extends Module> getJacksonModules()
{
return ImmutableList.of(
new com.fasterxml.jackson.databind.Module()
new Module()
{
@Override
public String getModuleName()
Original file line number Diff line number Diff line change
@@ -31,7 +31,7 @@
import java.util.concurrent.TimeUnit;

/**
* Emits all the events instance of {@link com.metamx.emitter.service.ServiceMetricEvent}.
* Emits all the events instance of {@link ServiceMetricEvent}.
* <p>
* All the dimensions will be retained and lexicographically order using dimensions name.
* <p>
Original file line number Diff line number Diff line change
@@ -59,6 +59,7 @@
import org.apache.orc.CompressionKind;
import org.apache.orc.OrcFile;
import org.apache.orc.TypeDescription;
import org.apache.orc.Writer;
import org.joda.time.DateTime;
import org.joda.time.DateTimeComparator;
import org.joda.time.Interval;
@@ -141,7 +142,7 @@ private File writeDataToLocalOrcFile(File outputDir, List<String> data) throws I
.addField("host", TypeDescription.createString())
.addField("visited_num", TypeDescription.createInt());
Configuration conf = new Configuration();
org.apache.orc.Writer writer = OrcFile.createWriter(
Writer writer = OrcFile.createWriter(
new Path(outputFile.getPath()),
OrcFile.writerOptions(conf)
.setSchema(schema)
Original file line number Diff line number Diff line change
@@ -126,7 +126,7 @@ public class SQLServerConnector extends SQLMetadataConnector
* <p>
*
* @see <a href="https://github.com/spring-projects/spring-framework/blob/v4.3.2.RELEASE/spring-jdbc/src/main/java/org/springframework/jdbc/support/SQLStateSQLExceptionTranslator.java">Spring Framework SQLStateSQLExceptionTranslator</a>
* @see java.sql.SQLException#getSQLState()
* @see SQLException#getSQLState()
*/
private final Set<String> TRANSIENT_SQL_CLASS_CODES = new HashSet<>(Arrays.asList(
"08", "53", "54", "57", "58", // Resource Failures
@@ -265,7 +265,7 @@ public DBI getDBI()
*
* {@inheritDoc}
*
* @see java.sql.SQLException#getSQLState()
* @see SQLException#getSQLState()
*
*/
@Override
Original file line number Diff line number Diff line change
@@ -18,6 +18,7 @@
*/
package io.druid.metadata.storage.sqlserver;

import com.fasterxml.jackson.databind.Module;
import com.google.common.collect.ImmutableList;
import com.google.inject.Binder;
import com.google.inject.Key;
@@ -44,7 +45,7 @@ public SQLServerMetadataStorageModule()
}

@Override
public List<? extends com.fasterxml.jackson.databind.Module> getJacksonModules()
public List<? extends Module> getJacksonModules()
{
return ImmutableList.of();
}
Original file line number Diff line number Diff line change
@@ -19,6 +19,7 @@
package io.druid.data.input;

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.google.common.base.Function;
@@ -121,7 +122,7 @@ public void before()
{
jsonMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
jsonMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
for (com.fasterxml.jackson.databind.Module jacksonModule : new AvroExtensionsModule().getJacksonModules()) {
for (Module jacksonModule : new AvroExtensionsModule().getJacksonModules()) {
jsonMapper.registerModule(jacksonModule);
}
}
Original file line number Diff line number Diff line change
@@ -118,7 +118,7 @@ public void close(String namespace)
@Override
public io.druid.client.cache.CacheStats getStats()
{
final com.github.benmanes.caffeine.cache.stats.CacheStats stats = cache.stats();
final CacheStats stats = cache.stats();
final long size = cache
.policy().eviction()
.map(eviction -> eviction.isWeighted() ? eviction.weightedSize() : OptionalLong.empty())
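The getStats() hunk above shows the one case the rule cannot fully shorten: two classes named CacheStats are in scope, so only one can be imported. Here Caffeine's CacheStats takes the simple name while Druid's own io.druid.client.cache.CacheStats in the return type stays fully qualified. The same situation can be reproduced with the JDK's two List classes:

```java
import java.util.ArrayList;
import java.util.List; // only one of the two clashing names gets the import

public class NameClashExample
{
  // The imported java.util.List can use the simple name...
  private static final List<String> NAMES = new ArrayList<>();

  public static void main(String[] args)
  {
    NAMES.add("example");
    // ...while java.awt.List, which clashes with it, must keep its
    // fully qualified name, just as one CacheStats does above.
    System.out.println(java.awt.List.class.getName()); // prints "java.awt.List"
    System.out.println(NAMES.size()); // prints "1"
  }
}
```

A PMD or inspection rule targeting *unnecessary* qualification leaves this qualified use alone, since removing it would not compile.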
Original file line number Diff line number Diff line change
@@ -43,6 +43,7 @@
import kafka.consumer.KafkaStream;
import kafka.consumer.Whitelist;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.javaapi.consumer.ZookeeperConsumerConnector;
import kafka.message.MessageAndMetadata;
import kafka.serializer.Decoder;

@@ -294,7 +295,7 @@ public void onFailure(Throwable t)
// Overridden in tests
ConsumerConnector buildConnector(Properties properties)
{
return new kafka.javaapi.consumer.ZookeeperConsumerConnector(
return new ZookeeperConsumerConnector(
new ConsumerConfig(properties)
);
}
Original file line number Diff line number Diff line change
@@ -42,6 +42,7 @@
import kafka.utils.Time;
import kafka.utils.ZKStringSerializer$;
import org.I0Itec.zkclient.ZkClient;
import org.I0Itec.zkclient.exception.ZkException;
import org.apache.curator.test.TestingServer;
import org.apache.zookeeper.CreateMode;
import org.joda.time.DateTime;
@@ -199,7 +200,7 @@ public void close() throws Exception
try {
zkClient.deleteRecursive(zkKafkaPath);
}
catch (org.I0Itec.zkclient.exception.ZkException ex) {
catch (ZkException ex) {
log.warn(ex, "error deleting %s zk node", zkKafkaPath);
}
}
(Diff truncated: remaining changed files not shown.)
