Type: Feature Request
Resolution: Unresolved
Priority: Minor
Hi,
I'm currently trying to solve the following problem for the JBeret-SE implementation of the library.
My jobs run as a standalone Java SE process, but the environment setup I'm using allows multiple JVM instances to run concurrently to perform their batch tasks.
The issue is that launching multiple instances of the same jar causes my loggers to write to the same file, which makes the log output harder to read.
For this reason I modified the logger pattern to include the job execution ID, added a couple of parameters to my MDC, and created a ThreadFactory that propagates the MDC to every child thread. The real issue I found at this point is that the execution ID is generated atomically during the submission of the task, and by then it's already too late to influence the already instantiated thread.
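For reference, here is a minimal sketch of such a ThreadFactory (assuming the SLF4J MDC API; jboss-logging exposes a similar static MDC class, and the logger pattern would reference the key with something like %X{jobExecutionId}):

import java.util.Map;
import java.util.concurrent.ThreadFactory;
import org.slf4j.MDC;

// Copies the submitting thread's MDC into every thread it creates,
// so child threads log with the same contextual keys.
public class MdcPropagatingThreadFactory implements ThreadFactory {
    @Override
    public Thread newThread(final Runnable task) {
        // capture the caller's MDC at thread-creation time
        final Map<String, String> callerMdc = MDC.getCopyOfContextMap();
        return new Thread(() -> {
            if (callerMdc != null) {
                MDC.setContextMap(callerMdc); // install the captured MDC in the child thread
            }
            try {
                task.run();
            } finally {
                MDC.clear(); // avoid leaking context between tasks
            }
        });
    }
}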
The only solution I came up with was to replicate the sequence of calls performed in the method start(final Job jobDefined, Properties jobParameters, String user), like this:
final DelegatingJobOperator operator = DelegatingJobOperator.class.cast(jobOperator);
final BatchEnvironment env = AbstractJobOperator.class.cast(operator.getDelegate()).getBatchEnvironment();
final JobRepository repo = env.getJobRepository();
final String appName = getApplicationName();
final Job job = ArchiveXmlLoader.loadJobXml(jobParameters.getProperty(JOB_NAME_KEY),
        env.getClassLoader(), new ArrayList<Job>(), env.getJobXmlResolver());
repo.addJob(new ApplicationAndJobName(appName, job.getId()), job);
JobInstanceImpl instance = repo.createJobInstance(job, appName, env.getClassLoader());
final JobExecutionImpl execution = repo.createJobExecution(instance, jobParameters);
// here I copy the execution ID into the MDC, before the task is submitted
MDC.put(MDC_JOB_EXECUTION_KEY, String.valueOf(execution.getExecutionId()));
final JobContextImpl context = new JobContextImpl(execution, null,
        new ArtifactFactoryWrapper(env.getArtifactFactory()), repo, env);
final JobExecutionRunner task = new JobExecutionRunner(context);
env.submitTask(task);
but clearly this is a rather inelegant solution to this kind of problem.
My idea, then, would be to add a new property to the jberet.properties file referencing a listener class that receives the jobContext just before submission. Something like this (adapted from the AbstractJobOperator.startJobExecution method):
private long startJobExecution(final JobInstanceImpl jobInstance, final Properties jobParameters,
                               final JobExecutionImpl originalToRestart, final String user)
        throws JobStartException, JobSecurityException {
    final BatchEnvironment batchEnvironment = getBatchEnvironment();
    final JobRepository repository = getJobRepository();
    final JobExecutionImpl jobExecution = repository.createJobExecution(jobInstance, jobParameters);
    jobExecution.setUser(user);
    final JobContextImpl jobContext = new JobContextImpl(jobExecution, originalToRestart,
            new ArtifactFactoryWrapper(batchEnvironment.getArtifactFactory()), repository, batchEnvironment);
    final JobExecutionRunner jobExecutionRunner = new JobExecutionRunner(jobContext);
    // invocation of the listener, before submission
    jobContext.getBatchEnvironment().getTaskSubmissionListener().beforeSubmit(jobContext);
    jobContext.getBatchEnvironment().submitTask(jobExecutionRunner);
    // and after submission
    jobContext.getBatchEnvironment().getTaskSubmissionListener().afterSubmit(jobContext);
    return jobExecution.getExecutionId();
}
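To make the proposal concrete, here is a hypothetical sketch of the SPI (the interface name, the property key, and the lookup method are my assumptions, not an existing JBeret API):

import org.jberet.runtime.context.JobContextImpl;
import org.slf4j.MDC;

// Hypothetical callback interface, loaded by the batch environment from a
// jberet.properties entry such as:
//   task.submission.listener = com.example.MdcTaskSubmissionListener
public interface TaskSubmissionListener {
    // runs on the submitting thread, before the JobExecutionRunner is queued
    void beforeSubmit(JobContextImpl jobContext);

    // runs on the submitting thread, after the task has been handed off
    void afterSubmit(JobContextImpl jobContext);
}

// Example implementation covering my use case: copy the execution ID into the
// MDC, so the MDC-propagating ThreadFactory carries it into the job's threads.
class MdcTaskSubmissionListener implements TaskSubmissionListener {
    @Override
    public void beforeSubmit(final JobContextImpl jobContext) {
        MDC.put("jobExecutionId", String.valueOf(jobContext.getExecutionId()));
    }

    @Override
    public void afterSubmit(final JobContextImpl jobContext) {
        // nothing to do after submission in this case
    }
}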
Do you think this would make sense? I can propose the change with a pull request on GitHub, if that's OK with you.